
TULLOCH

Data Engineer

In-Office
Ottawa, ON
Senior level
Come Join Us!
“We want to build an organization where everyone loves their job and their leaders care for them.”
Over the last 30 years, TULLOCH has built a robust multi-disciplinary consulting engineering firm recognized Canada-wide for the strength of its diverse service offerings and its commitment to excellence. TULLOCH’s innovative use of emerging technologies to improve both the efficiency and quality of work is core to everything TULLOCH does. This approach, along with our extreme work ethic, makes us a service provider of choice for many clients.
Based in Ottawa, Ontario, the Microsoft Fabric Data Engineer will report to the Director of Technology and will have a wide range of duties focused on designing, developing, and optimizing data solutions using Microsoft Fabric, ensuring scalable, reliable, and secure data pipelines that support advanced analytics, business intelligence, and AI/ML workloads. The role works closely with business stakeholders to deliver actionable insights that drive business outcomes.
What You’ll Do:
As part of the TULLOCH Engineering team, you will:
  • Data Architecture & Modeling
    • Design and implement modern data architectures within Microsoft Fabric, leveraging Lakehouse, Data Warehousing, and OneLake features.
    • Develop scalable data models optimized for analytics and reporting with Power BI integrations.
  • ETL / ELT Development
    • Build, automate, and optimize data pipelines using Data Factory, Synapse, and Fabric Dataflows (see the brief sketch following this list).
    • Ensure efficient ingestion, transformation, and curation of structured and unstructured data from diverse sources (on-prem, cloud, APIs).
  • Data Quality & Governance
    • Implement and enforce data governance, cataloging, lineage, and security policies in Fabric.
    • Ensure data accuracy, consistency, and compliance with internal and regulatory standards (e.g., GDPR, HIPAA).
  • Performance & Optimization
    • Optimize data processing and query performance for large-scale datasets.
    • Apply best practices for partitioning, indexing, and workload management in Fabric environments.
  • Collaboration & Leadership
    • Work closely with business stakeholders to understand requirements and translate them into solutions.
    • Mentor junior engineers and contribute to the establishment of best practices and coding standards.
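To illustrate the kind of Fabric pipeline work described above, here is a minimal sketch, assuming a PySpark notebook attached to a Fabric Lakehouse; the file path, table name, and column names are hypothetical and not taken from this posting:

    # Runs in a Microsoft Fabric notebook, where `spark` is provided and the
    # attached Lakehouse exposes a Files/ area and a Tables/ (Delta) area.
    from pyspark.sql import functions as F

    # Ingest raw CSV files landed in the Lakehouse Files area (hypothetical path)
    raw = (spark.read
           .option("header", "true")
           .csv("Files/raw/orders/"))

    # Curate: typed columns, a load date for partitioning, and a basic quality filter
    curated = (raw
               .withColumn("order_ts", F.to_timestamp("order_ts"))
               .withColumn("amount", F.col("amount").cast("double"))
               .withColumn("load_date", F.to_date("order_ts"))
               .filter(F.col("order_id").isNotNull()))

    # Persist as a partitioned Delta table so downstream SQL and Power BI
    # queries can prune partitions
    (curated.write
     .mode("overwrite")
     .format("delta")
     .partitionBy("load_date")
     .saveAsTable("orders_curated"))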
Please note that this job description is not meant to be an all-inclusive statement of every duty and responsibility that will ever be required of an employee in the job.
Who You Are:
  • 6+ years of experience in data engineering, including at least two years of hands-on work with Microsoft Fabric and Azure Data Services.
  • Strong expertise in Microsoft Fabric, including Lakehouse, OneLake, Data Engineering, and Synapse Data Warehousing.
  • Strong background in Linux/Unix system administration and scripting (Bash, Python, PowerShell).
  • Proven experience with CI/CD tools (Jenkins, GitLab CI, GitHub Actions, Azure DevOps, etc.).
  • Experience with AI/ML model integration within Microsoft Fabric.
  • Hands-on experience with real-time streaming solutions (Event Hubs, Kafka, etc.).
  • Understanding of networking, firewalls, load balancers, and security best practices.
  • Experience with cloud platforms (AWS, Azure, GCP) in production environments.
  • You can exercise independent judgment within defined parameters and in alignment with business objectives.
  • You are willing to travel periodically as needed.
What You Should Bring:
The ideal candidate will have a Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field. Additionally, you should have:
  • Proficiency in SQL, PySpark, Python, and DAX.
  • Experience with ETL/ELT pipelines (Data Factory, Synapse Pipelines, or similar).
  • Knowledge of Power BI integration for reporting and analytics.
  • Familiarity with Azure ecosystem: Azure Data Lake, Azure Databricks, Azure Synapse, Azure Purview.
  • Experience with data governance, lineage, and security frameworks.
  • Strong leadership ability in cross-functional and geographically distributed teams.
  • Strong analytical and problem-solving abilities.
  • Excellent communication skills to bridge technical and business teams.
  • Ability to work in agile, collaborative environments.
  • Valid Class G driver’s licence.

What We Offer You:
TULLOCH has built a passionate workforce with a strong and vibrant culture which has been the key to our success. We offer programs and rewards that one would expect from a highly successful, established, and growing engineering company:
  • Competitive salary, benefits plan, and company pension plan
  • A fantastic culture, team, and energetic work environment
  • Social activities, company sponsored events, and opportunities to give back to our local community
  • Flexible working hours
  • Coaching and mentoring programs
  • Scholarship programs for family members
  • Opportunities to travel and work across Canada
  • Hybrid working options

TULLOCH is an equal opportunity employer that is committed to acquiring a skilled and diverse workforce. We encourage applications from candidates of all backgrounds, origins, ages, orientations, genders, creeds, and religions. TULLOCH accommodates people with disabilities throughout the recruitment and selection process. TULLOCH is an excellent place to work and we look forward to meeting with you! If contacted regarding this competition, please advise the interview coordinator of any accommodation measures you may require.
 

Top Skills

Azure Data Lake
Azure Data Services
Azure Databricks
Azure Purview
Azure Synapse
CI/CD tools (Jenkins, GitLab CI, GitHub Actions, Azure DevOps)
Data Factory
DAX
Event Hubs
Kafka
Microsoft Fabric
Power BI
PySpark
Python
SQL
Synapse


