As a Senior Data Engineer, you will architect reliable data pipelines, perform data analysis, mentor junior engineers, and manage cloud data infrastructure, ensuring data governance and compliance while collaborating with cross-functional teams.
At Netomi AI, we are on a mission to create artificial intelligence that builds customer love for the world’s largest global brands.
Some of the largest brands already use Netomi AI’s platform to solve mission-critical problems, giving you the opportunity to work with top-tier clients at the senior level and build your network.
Backed by the world’s leading investors such as Y Combinator, Index Ventures, Jeffrey Katzenberg (co-founder of DreamWorks) and Greg Brockman (co-founder and President of OpenAI/ChatGPT), you will become part of an elite group of visionaries who are defining the future of AI for customer experience. We are building a dynamic, fast-growing team that values innovation, creativity, and hard work. You will have the chance to significantly impact the company’s success while developing your skills and career in AI.
Want to become a key part of the Generative AI revolution? We should talk.
We are looking for a Senior Data Engineer with a passion for using data to discover and solve real-world problems. You will enjoy working with rich data sets and modern business intelligence technology, and seeing your insights drive features for our customers. You will also have the opportunity to help develop the policies, processes, and tools that address product quality challenges, in collaboration with other teams.
What You’ll Do
- Architect and implement scalable, secure, and reliable data pipelines using modern data platforms (e.g., Spark, Databricks, Airflow, Snowflake); a minimal orchestration sketch follows this list.
- Develop ETL/ELT processes to ingest data from various structured and unstructured sources.
- Perform Exploratory Data Analysis (EDA) to uncover trends, validate data integrity, and derive insights that inform data product development and business decisions.
- Collaborate closely with data scientists, analysts, and software engineers to design data models that support high-quality analytics and real-time insights.
- Lead data infrastructure projects including management of data on cloud platforms (AWS/Azure), data lake/warehouse implementations, and data quality frameworks.
- Ensure data governance, security, and compliance best practices are followed.
- Monitor and optimize the performance of data systems, addressing any issues proactively.
- Mentor junior data engineers and help establish data engineering best practices across standards, tooling, and development workflows.
- Stay current with emerging technologies and trends in data engineering and recommend improvements as needed.
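To make the pipeline work concrete, here is a minimal sketch of a daily ETL DAG in Apache Airflow (2.4+ syntax). The task logic, the source data, and the target table name are hypothetical placeholders for illustration, not a description of our production stack.

```python
# Minimal sketch of a daily extract -> transform -> load DAG in Apache Airflow.
# All data sources, table names, and transform rules below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a hypothetical source (API, S3 drop, etc.).
    return [{"id": 1, "value": 42}, {"id": 2, "value": None}]


def transform(**context):
    # Read the upstream task's output from XCom and drop malformed rows.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [row for row in rows if row["value"] is not None]


def load(**context):
    # Write the cleaned rows to a hypothetical warehouse table.
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows into analytics.events")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

In practice the callables would hand off data through object storage or staging tables rather than XCom, but the DAG structure above is the shape of the pipelines this role owns.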
Required Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 8+ years of hands-on experience in data engineering or backend software development roles.
- Proficiency with Python, SQL, and at least one data pipeline orchestration tool (e.g., Apache Airflow, Luigi, Prefect).
- Strong experience with cloud-based data platforms (e.g., AWS Redshift, GCP BigQuery, Snowflake, Databricks).
- Deep understanding of data modeling, data warehousing, and distributed systems.
- Experience with big data technologies such as Apache Spark, Kafka, and Hadoop (see the transformation sketch after this list).
- Familiarity with DevOps practices (CI/CD, infrastructure as code, containerization with Docker/Kubernetes).
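As an illustration of the Spark experience we look for, here is a minimal PySpark batch-transformation sketch. The S3 paths, column names, and aggregation are hypothetical examples, not our schema.

```python
# Minimal sketch of a batch rollup in PySpark; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_daily_rollup").getOrCreate()

# Read raw JSON events from a hypothetical data-lake location.
events = spark.read.json("s3://example-bucket/raw/events/")

# Aggregate to one row per customer per day, dropping records with no timestamp.
daily = (
    events
    .where(F.col("event_ts").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("customer_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Write the result as date-partitioned Parquet for downstream analytics.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events_daily/"
)
```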
Preferred Qualifications
- Experience working with real-time data processing and streaming data architectures.
- Knowledge of data security and privacy regulations (e.g., GDPR, HIPAA).
- Exposure to machine learning pipelines or supporting data science workflows.
- Familiarity with prompt engineering and how LLM-based systems interact with data.
- Experience working in cross-functional teams and with stakeholders from non-technical domains.
Netomi is an equal opportunity employer committed to diversity in the workplace. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, disability, veteran status, and other protected characteristics.
Top Skills
Apache Airflow
Spark
AWS Redshift
Databricks
Docker
GCP BigQuery
Hadoop
Kafka
Kubernetes
Python
Snowflake
SQL