As a Senior Data Engineer, you'll design, build, and deploy modern data infrastructure, collaborate across teams, optimize data workflows, and ensure system reliability in a dynamic environment.
At Wave, we help small businesses thrive so the heart of our communities beats stronger. We work in an environment buzzing with creative energy and inspiration. No matter where you are or how you get the job done, you have what you need to be successful and connected. The mark of true success at Wave is the ability to be bold, learn quickly, and share your knowledge generously.
Reporting to the Senior Manager of AI & Data Platform, you will build tools and infrastructure to support the efforts of the Data Products and Insights & Innovation teams, and the business as a whole.
We’re looking for a talented, curious self-starter who is driven to solve complex problems and can juggle multiple domains and stakeholders. This highly technical individual will collaborate with all levels of the Data and AI team as well as various engineering teams to develop data solutions, scale our data infrastructure, and advance Wave to the next stage in our transformation as a data-centric organization.
This role is for someone with proven experience in complex product environments. Strong communication skills are a must to bridge the gap between technical and non-technical audiences across a spectrum of data maturity.
At Wave, you’ll have the chance to grow and thrive by building scalable data infrastructure, enhancing a modern data stack, and contributing to high-impact projects that empower insights and innovation across the company.
Here’s How You Make an Impact:
- You’re a builder. You’ll be responsible for designing, building, and deploying the components of a modern data stack, including CDC ingestion (using Meltano or similar tools), a centralized Iceberg data lake, and a variety of batch, incremental, and stream-based pipelines.
- You’ll make things better. You enjoy the challenge of helping build and manage a fault-tolerant data platform that scales economically while balancing innovation with operational stability: maintaining legacy Python ELT scripts while accelerating the transition to dbt models in Redshift, Snowflake, or Databricks.
- You’re all about collaboration and relationships. You will collaborate within a cross-functional team in planning and rolling out data infrastructure and processing pipelines that serve workloads across analytics, machine learning, and GenAI services. You enjoy working with different teams across Wave and helping them succeed by ensuring that their data, analytics, and AI insights are reliably delivered.
- You’re self-motivated and can work autonomously. We count on you to thrive in ambiguous conditions by independently identifying opportunities to optimize pipelines and improve data workflows under tight deadlines.
- You will resolve and mitigate incidents. You will respond to alerts and proactively implement monitoring solutions to minimize future incidents, ensuring high availability and reliability of data systems.
- You're a strong communicator. As a data practitioner, you’ll have people coming to you for technical assistance, and your outstanding ability to listen and communicate will reassure them as you address their concerns.
- You love helping customers. You will assess existing systems, optimize data accessibility, and provide innovative solutions to help internal teams surface actionable insights that enhance external customer satisfaction.
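The CDC-style ingestion mentioned above can be illustrated in miniature. This is a toy, stdlib-only sketch of the merge semantics behind change-data-capture: applying a stream of insert, update, and delete events to a keyed table state. All function and field names here are hypothetical, not Wave's actual code; real pipelines would land such events via tools like Meltano or Debezium into an Iceberg table.

```python
# Toy illustration of CDC merge semantics: apply a stream of change
# events (insert/update/delete) to an in-memory "table" keyed by id.
# Hypothetical names for illustration only.

def apply_cdc_events(table: dict, events: list) -> dict:
    """Apply change events in order; later events win for the same key."""
    for event in events:
        op, row = event["op"], event["row"]
        key = row["id"]
        if op in ("insert", "update"):
            # Upsert: merge new fields over any existing row
            table[key] = {**table.get(key, {}), **row}
        elif op == "delete":
            # Idempotent delete: no error if the key is already gone
            table.pop(key, None)
    return table

events = [
    {"op": "insert", "row": {"id": 1, "name": "Acme", "plan": "free"}},
    {"op": "update", "row": {"id": 1, "plan": "pro"}},
    {"op": "insert", "row": {"id": 2, "name": "Beta Co"}},
    {"op": "delete", "row": {"id": 2}},
]
state = apply_cdc_events({}, events)
print(state)  # {1: {'id': 1, 'name': 'Acme', 'plan': 'pro'}}
```

The key design point is ordering: change events must be applied in commit order per key, which is why CDC pipelines typically partition their event streams by primary key.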
You Thrive Here By Possessing the Following:
- Data Engineering Expertise: Bring 6+ years of experience building data pipelines and managing a secure, modern data stack, including CDC streaming ingestion (using Meltano or similar tools) for data ingestion workflows that support AI/ML workloads, and a curated data warehouse in Redshift, Snowflake, or Databricks.
- AWS Cloud Proficiency: At least 3 years of experience working with AWS cloud infrastructure, including Kafka (MSK), Spark / AWS Glue, and infrastructure as code (IaC) using Terraform.
- Strong Coding Skills: Write and review high-quality, maintainable code that enhances the reliability and scalability of our data platform. We use Python, SQL, and dbt extensively, and you should be comfortable leveraging third-party frameworks to accelerate development.
- Data Lake Development: Prior experience building data lakes on S3 using Apache Iceberg with Parquet, Avro, JSON, and CSV file formats.
- Workflow Automation: Experience with Airflow or similar orchestration systems to build and manage multi-stage workflows that automate and orchestrate data processing pipelines.
- Data Governance Knowledge: Familiarity with data governance practices, including data quality, lineage, and privacy, as well as experience using cataloging tools to enhance discoverability and compliance.
- CI/CD Best Practices: Experience developing and deploying data pipeline solutions using CI/CD best practices to ensure reliability and scalability.
- Data Integration Tools: Working knowledge of tools such as Stitch and Segment CDP for integrating diverse data sources into a cohesive ecosystem.
- Analytical and ML Tools Expertise: Knowledge and practical experience with Athena, Redshift, or SageMaker Feature Store to support analytical and machine learning workflows is a definite bonus!
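The batch and incremental pipelines these requirements describe commonly rest on a high-watermark pattern: each orchestrated run extracts only rows updated since the last successful run. A minimal, stdlib-only sketch follows; the function and field names are illustrative assumptions, not a specific Airflow or dbt API.

```python
# Minimal sketch of an incremental ("high-watermark") extract, the
# pattern behind many orchestrated batch pipelines. Names hypothetical.
from datetime import datetime

def incremental_extract(rows: list, watermark: str) -> tuple:
    """Return rows newer than the watermark, plus the advanced watermark."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    cutoff = datetime.strptime(watermark, fmt)
    fresh = [r for r in rows if datetime.strptime(r["updated_at"], fmt) > cutoff]
    if fresh:
        # ISO-8601 timestamps sort lexicographically, so max() is safe here
        watermark = max(r["updated_at"] for r in fresh)
    return fresh, watermark

source = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00"},
    {"id": 2, "updated_at": "2024-01-02T09:30:00"},
    {"id": 3, "updated_at": "2024-01-03T12:00:00"},
]
batch, wm = incremental_extract(source, "2024-01-01T12:00:00")
print(len(batch), wm)  # 2 2024-01-03T12:00:00
```

In practice the watermark would be persisted between runs (for example in an orchestrator variable or a metadata table) so a failed run can safely re-extract the same window.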
Succeeding at Wave: Whether collaborating in our vibrant downtown Toronto hub or working remotely, you’ll have the flexibility to shape your journey and make a lasting impact on Wave’s data-driven future. At Wave, we value diverse perspectives and encourage open, respectful feedback, fostering an inclusive environment where innovation flourishes and every team member has the opportunity to grow.
At Wave, we value diversity of perspective. Your unique experience enriches our organization. We welcome applicants from all backgrounds. Let’s talk about how you can thrive here!
Wave is committed to providing an inclusive and accessible candidate experience. If you require accommodations during the recruitment process, please let us know by emailing [email protected]. We will work with you to meet your needs.
Top Skills
Apache Hudi
Athena
Avro
AWS
AWS Glue
AWS Step Functions
CSV
dbt
Debezium
JSON
Kafka
MSK
Parquet
Python
Redshift
S3
SageMaker
Segment
Serverless Lambdas
Spark
SQL
Stitch
Terraform