Mars

Data Engineer

Posted 18 Hours Ago
Remote
10 Locations
Senior level
The Data Engineer will develop and maintain data assets and pipelines, utilizing technologies like Python, Spark, and Azure. Responsibilities include collaborating on data solutions, implementing best practices in data engineering, and supporting data management across various divisions. The role emphasizes continuous learning and contribution to innovative data solutions in a cloud environment.

Job Description:

This role is an incubation role (temporary) with an estimated end date of December 2026. Its purpose is to fast-track and support the build of this specific product. Once the product is complete, a permanent BAU role will open to maintain and support it; that role will have a different job description, better suited to the needs of the organisation at end state. If you are unable to secure the permanent role by December 2026, you may be eligible to receive separation benefits consistent with Company policies and practice.

As part of our Pet Nutrition digital-first strategy, our purpose is to establish strong Digital & Data Foundations (DDF) for PN products through transversal foundational technology and data capabilities that enable the creation of scalable, fit-for-purpose solutions, deliver superior propositions and fuel an integrated supply chain, with increased agility and reduced cost. Some of the key deliverables we will look to unlock are:

  • From siloed, small-scale product performance evaluation to fully integrated, data-driven performance assessment through real-time predictive solutions; maximising capability and reducing evaluation time and cost by building the confidence to drive superior propositions across the portfolio.
  • From traditional innovation and scale-up protocols to innovative digital modelling that enables rapid scenario evaluation and accelerated development and scale-up, with increased agility and reduced costs and resources.
  • From dispersed physical quality records to digitalised quality standards, data capture and trend prediction, which can be leveraged to proactively mitigate emerging risks and avoid non-quality impacts.
  • From fragmented legacy IT systems holding unreliable data to an integrated R&D and Supply digital and data ecosystem with respective sub-domains, enacting step-change operational efficiency and maximising business value by confidently utilising trustworthy data.

What are we looking for?

  • 5+ years as a Data Engineer
  • Experience with Spark, Databricks, or similar data processing tools.
  • Proficiency in working with cloud environments and a range of data stores, including SQL Server, Hadoop, and NoSQL databases.
  • Proficiency in Python (or a similar language), SQL, and Spark.
  • Proven ability to develop data pipelines (ETL/ELT); a brief illustrative sketch follows this list.
  • Strong inclination to learn and adapt to new technologies and languages.
  • Expertise in designing and building Big Data databases, analytics, and BI platforms.
  • Strong understanding and experience in working with Databricks Delta Lake.
  • Keen interest in the latest trends and tools in data engineering and analytics.
  • Familiarity with graph databases (e.g., Neo4j/Cypher).
  • Experience with data visualization tools (e.g., PowerBI).
  • Proficiency in Microsoft Azure cloud technologies would be a bonus.
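
By way of illustration only, the sketch below shows the kind of ETL/ELT pipeline this role involves, using PySpark to read raw files and write a curated Delta Lake table. The paths, column names, and table layout are hypothetical and not part of this job description; the Delta Lake format assumes a Databricks (or equivalently configured Spark) runtime.

    # Illustrative only: a minimal PySpark ETL sketch writing a Delta Lake table.
    # All paths and column names below are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("pn-curation-example").getOrCreate()

    # Extract: read raw CSV files landed in the data lake.
    raw = spark.read.option("header", True).csv("/mnt/raw/feeding_trials/")

    # Transform: de-duplicate, type the date column, and drop records without a key.
    clean = (
        raw.dropDuplicates(["trial_id"])
           .withColumn("trial_date", F.to_date("trial_date"))
           .filter(F.col("trial_id").isNotNull())
    )

    # Load: persist a curated Delta table for downstream analytics and BI.
    clean.write.format("delta").mode("overwrite").save("/mnt/curated/feeding_trials/")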

What will be your key responsibilities?

As a Data Engineer in the DDF team, your key responsibilities are as follows:

1. Technical Proficiency:

  • Collaborate in hands-on development using Python, PySpark, and other relevant technologies to create and maintain data assets and reports for business insights.
  • Assist in engineering and managing data models and pipelines within a cloud environment, utilizing technologies like Databricks, Spark, Delta Lake, and SQL (an illustrative upsert sketch follows this list).
  • Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJS.
  • Support the implementation of DevOps and CI/CD methodologies to foster agile collaboration and contribute to building robust data solutions.
  • Develop code that adheres to high-quality standards, promoting a scalable and maintainable data platform.
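
As a hedged illustration of maintaining data assets in Delta Lake, the sketch below shows an incremental upsert into an existing curated table using the delta-spark Python API. The table paths and the pet_id key are hypothetical and assume a Databricks/Delta runtime.

    # Illustrative only: incrementally upsert staged records into a curated
    # Delta Lake table. Paths, key column, and table layout are hypothetical.
    from pyspark.sql import SparkSession
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()

    updates = spark.read.format("delta").load("/mnt/staging/pet_profiles_updates/")
    target = DeltaTable.forPath(spark, "/mnt/curated/pet_profiles/")

    # Merge on the business key: update existing rows, insert new ones.
    (
        target.alias("t")
        .merge(updates.alias("u"), "t.pet_id = u.pet_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )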

2. Learning and Growth; Contribution to Solutions:

  • Collaborate with the team to learn and apply the best practices in data engineering.
  • Actively participate in engineering projects, gaining experience in developing high-quality, scalable, and sustainable data solutions.
  • Stay updated with emerging technologies and trends in data engineering, contributing to the team's knowledge base by sharing insights and ideas.
  • Assist in the development of data solutions within the Pet Nutrition data platform, working on challenging aspects under the guidance of senior team members.
  • Contribute to the management of data from various divisions to generate valuable data assets related to pets and pet owners.
  • Support the maintenance of a semantic, intelligent data layer that underpins the overall data solution within the environment.

3. Collaboration and Communication:

  • Collaborate closely with analysts, data scientists, and team members to understand their requirements and assist in translating them into actionable data solutions.
  • Maintain effective communication with the Data Engineering Lead, actively participating in team discussions and sharing ideas to improve platform excellence.

What can you expect from Mars?

  • Work with diverse and talented Associates, all guided by the Five Principles.
  • Join a purpose-driven company, where we’re striving to build the world we want tomorrow, today.
  • Best-in-class learning and development support from day one, including access to our in-house Mars University.
  • An industry competitive salary and benefits package, including company bonus.

#TBDDT

Mars is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. If you need assistance or an accommodation during the application process because of a disability, it is available upon request. The company is pleased to provide such assistance, and no applicant will be penalized as a result of such a request.

Top Skills

Azure
Databricks
Delta Lake
Hadoop
NoSQL Databases
Power BI
PySpark
Python
Spark
SQL
SQL Server
