
MetLife

Sr Data Engineer

Reposted 2 Hours Ago
Remote or Hybrid
Hiring Remotely in México
Senior level
The Senior Data Engineer will design and maintain ETL pipelines, optimize data processes, manage infrastructure using Terraform, and collaborate with cross-functional teams on data solutions.
Description and Requirements
Data Engineer, LATAM Data Hub
Role Value Proposition:
At LATAM Data Hub (LDH), our mission is to build the next-generation data lakehouse for MetLife and to help deploy it across LATAM countries. We have developed a world-class, cloud-native platform that enables reporting, analytics, data supply pipelines, and real-time delivery of data to digital and non-digital channels. The platform leverages cutting-edge open-source and proprietary technologies to create a highly configurable system that can be adapted to individual market needs quickly and at low cost. It runs in a fully containerized, elastic cloud environment and is designed to scale to serve millions of users.
We are looking for a Senior Data Engineer with a track record of designing and implementing large, complex technology projects at global scale. The ideal candidate has a solid foundation in hands-on ETL and analytical warehouse development, understands the complexities of managing end-to-end data pipelines, and has in-depth knowledge of data governance and data management concepts. To be successful in this role, the candidate needs a balance of product-centric technical expertise and the ability to navigate complex deployments involving multiple systems and teams. The role requires interaction with technical staff and senior business and IT partners around the world. It is also responsible for ensuring operational readiness by incorporating configuration management, exception handling, logging, and end-to-end operationalization of the batch and real-time pipelines that ingest, manage, and process data into the hub.
Key Responsibilities:
• Design, build, and maintain efficient, scalable extract, transform, load (ETL) pipelines to support business requirements (a minimal illustrative sketch follows this list).
• Analyze and optimize data pipelines to enhance execution speed and reliability through the development of quality code.
• Collaborate with DevOps team to implement robust monitoring and alerting solutions for critical workflows.
• Implement and manage infrastructure using Terraform (or similar) to ensure consistency and scalability.
• Interact closely with stakeholders, including data scientists, developers, analysts, and business teams, to align technical solutions with business goals.
• Learn new technologies readily and stay prepared to work with cutting-edge cloud technologies.
• Ingest large volumes of data from various platforms and write high-performance, reliable, and maintainable ETL and ELT code.
• Provide technical support, investigate and resolve production issues in data pipelines, ensuring minimal disruption to operations.
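For illustration only, a minimal PySpark ETL sketch of the kind of pipeline work described in this list. This is not MetLife's actual pipeline; the application name, file paths, and column names are hypothetical assumptions.

# Minimal, hypothetical PySpark ETL sketch; paths and columns are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy_etl_example").getOrCreate()

# Extract: read raw CSV files landed by an upstream system (path is hypothetical).
raw = spark.read.option("header", True).csv("/landing/policies/*.csv")

# Transform: basic cleansing and typing.
clean = (
    raw.dropDuplicates(["policy_id"])
       .filter(F.col("policy_id").isNotNull())
       .withColumn("premium", F.col("premium").cast("double"))
       .withColumn("load_date", F.current_date())
)

# Load: write a partitioned Parquet dataset to the analytical store (e.g., ADLS).
clean.write.mode("overwrite").partitionBy("load_date").parquet("/curated/policies")

spark.stop()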
Essential Business Experience and Technical Skills:
Required
  • 5+ years of ETL and data warehousing development experience
  • 3+ years of experience designing ETL, ELT, and data lakes on cloud or big data platforms
  • Demonstrated experience implementing and deploying scalable, performant data hubs at global scale
  • Demonstrated experience with cutting-edge database technologies and cloud services such as Azure, GCP, Databricks, or Snowflake; deep experience with technologies such as Spark (Scala/Python/Java), ADLS, Kafka, SQL, Synapse SQL, Cosmos DB, graph databases, or others
  • Hands-on expertise implementing analytical data stores, such as on the Azure platform using ADLS, Azure Data Factory, Databricks, and Cosmos DB (Mongo/Graph API), or others
  • Strong analytical skills for working with unstructured datasets
  • Strong problem-solving abilities and effective collaboration in cross-functional teams
  • Strong Python language knowledge.
  • Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.
  • 3+ years of experience with Terraform for infrastructure provisioning.
  • Eagerness to learn new technologies on the fly and ship them to production
  • Excellent communication skills: Demonstrated ability to explain complex technical content to both technical and non-technical audiences
  • Experience working on complex, large-scale, multi-team enterprise programs using agile development methodologies
  • Experience in solution implementation, performance testing, and tuning; management and performance tuning of ADLS, Synapse SQL, or GCP BigQuery and GCS, including partitioning and bucketing (see the sketch after this list)
  • Knowledge of computational complexity
  • Bachelor's degree in computer science or related field
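As a purely illustrative aside on the partitioning and bucketing item above, here is a short PySpark sketch contrasting the two write strategies. Table names, columns, and paths are hypothetical assumptions, not part of this role's actual environment.

# Hypothetical sketch contrasting partitioning and bucketing in Spark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("partition_bucket_example").getOrCreate()

df = spark.read.parquet("/curated/policies")  # hypothetical curated dataset

# Partitioning: directory-level layout that lets queries filtering on 'country'
# skip whole partitions (partition pruning).
df.write.mode("overwrite").partitionBy("country").parquet("/marts/policies_by_country")

# Bucketing: pre-clusters rows by a high-cardinality key so joins and aggregations
# on 'customer_id' can avoid a full shuffle; requires saving as a managed table.
(
    df.write.mode("overwrite")
      .bucketBy(32, "customer_id")
      .sortBy("customer_id")
      .saveAsTable("policies_bucketed")
)

spark.stop()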

Preferred:
  • Working knowledge of English
  • Experience in Data Warehouse projects.

Benefits We Offer
Our benefits are designed to care for your holistic well-being, with programs for physical and mental health, financial wellness, and family support. We offer major medical insurance and life insurance, combined with a competitive compensation package that includes performance bonuses, a savings fund, and a pension plan. We also offer expanded parental and adoption leave, as well as additional benefits such as volunteer time off, days off for your birthday and Cultural Heritage Day, cultural and sporting events, and much more!
About MetLife
Recognized on Fortune magazine's 2025 list of the "World's Most Admired Companies" and on Fortune's 2025 list of the World's 25 Best Workplaces™, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers.
Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by our core values - Win Together, Do the Right Thing, Deliver Impact over Activity, and Think Ahead - we are inspired to transform the next century of financial services. At MetLife, it's #AllTogetherPossible. Join us!
At MetLife, we are committed to fostering diversity among employees through non-discriminatory treatment regardless of gender, gender expression, sexual orientation, religion, age, nationality, marital status, disability, physical or economic condition, HIV status, or pregnancy as a condition of hiring, continued employment, or promotion, and we provide equal employment opportunities.
#BI-Hybrid

Top Skills

Azure
Cloud Technologies
Cosmos DB
Databricks
Data Lakes
ELT
ETL
GCP
Graph DBs
Java
Python
Scala
Snowflake
Spark
SQL
Synapse SQL
Terraform
