Position Overview:
ShyftLabs is a rapidly growing data product company founded in 2020. We specialize in building high-impact digital solutions for Fortune 500 companies, focusing on accelerating business growth through technical innovation.
We are seeking a skilled and passionate Data Engineer with deep expertise in Google Cloud Platform (GCP) and Google BigQuery to join our Toronto team. In this role, you will architect, build, and maintain the scalable data pipelines that serve as the foundation for our analytics and data science initiatives. You will own the entire data lifecycle, from ingestion and processing to warehousing and serving, creating a reliable ecosystem that empowers business leaders to make data-informed decisions.
Job Responsibilities:
- Data Architecture & Pipeline Development: Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines using GCP services like Dataflow, Cloud Functions, Pub/Sub, and Cloud Composer.
- Data Warehousing: Develop and manage our central data warehouse in Google BigQuery. Implement data models, schemas, and table structures optimized for performance and scalability.
- Data Processing & Transformation: Write clean, efficient, and robust code (primarily in SQL and Python) to transform raw data into curated, analysis-ready datasets.
- Infrastructure Optimization & Scalability: Monitor, troubleshoot, and optimize our data infrastructure for performance, reliability, and cost-effectiveness. Implement BigQuery best practices, including partitioning, clustering, and materialized views.
- Enable Data Accessibility & BI: Build and maintain curated data models that serve as the "source of truth" for business intelligence and reporting, ensuring data is ready for consumption by BI tools like Looker.
- Data Governance & Quality: Implement automated data quality checks, validation rules, and monitoring to ensure the accuracy and integrity of our data pipelines and warehouse.
- Collaboration: Work closely with software engineers, data analysts, and data scientists to understand their data requirements and provide the necessary infrastructure and data products.
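The automated data quality checks mentioned above can be sketched in a few lines of Python. The table, rules, and `validate_rows` helper below are hypothetical and shown only for illustration; a production pipeline would more typically express such rules as dbt tests or a framework like Great Expectations running against BigQuery.

```python
# Minimal sketch of an automated data quality check.
# The rule names and sample "orders" rows are hypothetical.

def validate_rows(rows, rules):
    """Apply each named rule to every row; return (row_index, rule_name) failures."""
    failures = []
    for i, row in enumerate(rows):
        for name, check in rules.items():
            if not check(row):
                failures.append((i, name))
    return failures

# Example validation rules for a hypothetical orders dataset.
RULES = {
    "order_id_not_null": lambda r: r.get("order_id") is not None,
    "amount_non_negative": lambda r: (r.get("amount") or 0) >= 0,
}

if __name__ == "__main__":
    sample = [
        {"order_id": 1, "amount": 19.99},
        {"order_id": None, "amount": -5.0},  # fails both rules
    ]
    for row_idx, rule in validate_rows(sample, RULES):
        print(f"row {row_idx} failed {rule}")
```

In practice the failure list would feed a monitoring or alerting step, so a broken upstream feed is caught before it reaches the curated models that BI tools consume.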
Basic Qualifications:
- Experience: 3-5+ years of hands-on experience in a Data Engineering, Software Engineering, or a similar role.
- Programming Skills: Strong proficiency in a programming language such as Python or Java for data processing and automation.
- Expert SQL Proficiency: Mastery of SQL for complex data manipulation, DDL/DML operations, and query optimization.
- Google BigQuery: Proven expertise in using BigQuery as a data warehouse, including data modeling, performance tuning, and cost management.
- GCP Data Services: Hands-on experience building data pipelines using the GCP ecosystem (e.g., Dataflow, Pub/Sub, Cloud Storage, Cloud Composer/Airflow).
- Data Pipeline Concepts: Deep understanding of ETL/ELT principles and data warehousing architecture (e.g., Star Schema, Data Lakes).
- Engineering Mindset: Strong problem-solving and troubleshooting skills with a focus on building scalable, maintainable, and automated systems.
Preferred Qualifications:
- BI Tool Integration: Experience building data models that power BI tools like Looker (knowledge of LookML is a strong plus), Tableau, or Power BI.
- Modern Data Stack Tools: Experience with tools like dbt, Dataform, or Fivetran for data transformation and integration.
- Infrastructure as Code (IaC): Familiarity with tools like Terraform or Deployment Manager for managing cloud infrastructure.
- Containerization: Knowledge of Docker and Kubernetes is a plus.
- Certifications: Google Cloud Professional Data Engineer certification is highly desirable.
- Version Control: Proficiency with Git for code management and CI/CD pipelines.
Why You’ll Love Working at ShyftLabs
At ShyftLabs, your work matters. We’re a growing data product company making a big impact with Fortune 500 clients and as we scale, you’ll have the chance to shape solutions, influence strategy, and grow your career alongside us.
Here’s what you can expect when you join our team:
- Hybrid Flexibility: Enjoy a hybrid model with 3+ days per week in our Toronto office.
- Downtown Toronto Office: Work in the heart of the city.
- Comprehensive Benefits: We cover 100% of health, dental, and vision insurance premiums for you and your dependents, with no out-of-pocket costs. Eligibility starts on day one.
- Growth & Learning: Access extensive learning and development resources to keep leveling up your skills.
Inclusion at ShyftLabs
ShyftLabs is an equal-opportunity employer committed to creating a safe, diverse, and inclusive environment. We encourage applicants of all backgrounds including ethnicity, religion, disability status, gender identity, sexual orientation, family status, age, and nationality to apply. If you require accommodation during the interview process, let us know and we’ll be happy to support you.
We’re building something big, and we want you on the journey with us. If you’re ready to use data and innovation to make an impact, apply today and let’s grow together.
Top Skills
Cloud Composer
Cloud Functions
Dataflow
Dataform
dbt
Docker
Fivetran
Google BigQuery
Google Cloud Platform (GCP)
Kubernetes
Looker
Pub/Sub
Python
SQL
Terraform