Position: Cloud Data Engineer
Location: Toronto, ON
Job ID#: RQ11099
Duration: 16 Months
Role Overview
We are seeking a skilled DataOps / Cloud Data Engineer to design, build, and support modern cloud‑based data platforms. This role
focuses on developing scalable data pipelines, transforming complex datasets,
and enabling analytics through robust data warehouse and lakehouse
architectures in cloud environments.
Key Responsibilities
- Design, develop, test, implement, and troubleshoot end-to-end data pipelines
- Build and maintain complex data transformation processes
- Develop and optimize data models for data warehouses and lakehouse platforms
- Work with structured and unstructured data from multiple sources
- Support full SDLC activities including requirements gathering, design, testing, deployment, and production support
- Collaborate with stakeholders using agile, scrum, or waterfall delivery models
- Participate in change, incident, and production support processes
Qualifications & Skills
Data Engineering (40%)
- Strong programming and scripting skills: Python, SQL, Linux shell, PowerShell
- Data manipulation and analysis using pandas and PySpark
- Experience working with XLSX, CSV, JSON, relational databases, and cloud storage
Cloud & Lakehouse Experience (25%)
- Hands-on experience with AWS and/or Azure
- Cloud data warehouse platforms (e.g., AWS Redshift)
- Data lakehouse platforms such as Databricks (Delta Lake)
- Data orchestration and automation solutions
Data Warehousing & ETL (25%)
- Experience with ETL tools such as Azure Data Factory, AWS Glue, and cloud-agnostic platforms like Informatica IDMC
- Strong knowledge of data modeling and architecture (relational and dimensional)
- Data reporting and visualization exposure
Additional Skills (10%)
- Full SDLC experience
- Agile/Scrum and Waterfall methodologies
- Strong communication, presentation, and stakeholder engagement skills
- Consulting mindset with strong problem-solving and decision-making abilities
- Experience working in the Ontario Public Sector (OPS) or other public sector environments
MUST‑HAVE REQUIREMENTS
- Data engineering using pandas and PySpark
- Hands-on experience with Databricks / Delta Lake
- Strong experience handling structured and unstructured data
- AWS services including Glue, Step Functions, Lambda, and S3
- ETL development using tools such as Informatica IDMC
About Symbiotic Digital
Symbiotic Digital provides the IT experts you need: people who solve problems and get things done. See what the top 2% in their field can do for you. (Symbiotic Digital is a division of Symbiotic Group Inc.)
We serve customers in two ways:
- Digital Experts: IT consulting expertise — Bright Minds That Produce Proven Results
- Recruitment: Find The Right People You Need
Serving IT and business leaders, Symbiotic Digital enables organizations to solve complex business and technology challenges by providing proven technical experts evaluated through our QMS (Quality Management System) Staff Development Model. We are a 100% Indigenous-owned company.
Learn more: https://www.symbioticgroup.com/home-sd/