We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums — and our people are everywhere in the world (17,000+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
Job Description

A) Provide Data Engineering Support
- Provide services to complete Data Engineering tasks as assigned by the Team Lead, Manager, Director, or others
- Provide Data Engineering expertise to build upon the tasks and capabilities defined
B) Manage Data Operations
- Monitor and respond to scheduled workloads that feed data from and to the data platform
- Restart and/or repair the workflows when necessary
- Update or create runbooks for future troubleshooting
- Closely follow run schedules and schedule changes during maintenance windows to ensure workloads execute after the maintenance window is complete
- Maintain a list of workloads and users upstream and downstream of the workloads
- Assess the impacts of schedule/run changes, processing issues, or data quality issues
- Communicate and work with Suppliers on schema drift and/or other Supplier issues
- Communicate delays or the impact of failures to stakeholders, and escalate to Leadership or the EOC as needed
- Create and execute quality scripts to monitor and maintain the accuracy of our data
- Respond to and resolve PagerDuty alerts and ServiceNow ticket assignments
- Monitor regular KTLO (keep-the-lights-on) activities to maintain good health of the services
- Monitor mirroring queues for abnormal behavior and resolve any issues
- Understand failures and schedule changes and assess their business impact
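As an illustration of the kind of quality script mentioned above, the sketch below checks a batch of rows for row-count and null-rate problems. It is a minimal, hypothetical example: the column names, thresholds, and check logic are illustrative assumptions, not taken from this posting.

```python
# Hypothetical data-quality check of the kind run against platform tables.
# Column names and thresholds below are illustrative assumptions.

def run_quality_checks(rows, required_columns, max_null_rate=0.05, min_rows=1):
    """Return a list of human-readable failures for a batch of rows."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
        return failures
    for col in required_columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        null_rate = nulls / len(rows)
        if null_rate > max_null_rate:
            failures.append(
                f"column '{col}' null rate {null_rate:.1%} exceeds {max_null_rate:.0%}"
            )
    return failures

if __name__ == "__main__":
    sample = [
        {"order_id": 1, "amount": 10.0},
        {"order_id": 2, "amount": None},
        {"order_id": None, "amount": 5.0},
    ]
    for failure in run_quality_checks(sample, ["order_id", "amount"]):
        print(failure)
```

In practice a script like this would read from the warehouse and feed its failures into PagerDuty or ServiceNow rather than printing them.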
C) Develop and enhance data pipelines
- Configure pipelines to ingest data from the supplier landing zone to the data platform, including configuration of Airflow ingestion pipelines and/or Snowflake external tables/Snowpipe
- Configure pipelines to ingest and process data from the supplier landing zone to the data platform, including configuration of Airflow ingestion pipelines and Snowflake, as directed by the technical lead and/or Scrum Master
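The Snowflake external table configuration referred to above might be generated along these lines. This is a sketch only: the table name, stage name, file pattern, and Parquet format are assumptions for illustration, not details from this posting.

```python
# Hypothetical sketch of the Snowflake side of a landing-zone ingestion pipeline:
# building the DDL for an external table over a supplier landing-zone stage.
# All object names, the file pattern, and the file format are assumptions.

def external_table_ddl(table, stage, pattern=".*[.]parquet"):
    """Build a CREATE EXTERNAL TABLE statement for a landing-zone feed."""
    return (
        f"CREATE OR REPLACE EXTERNAL TABLE {table}\n"
        f"  LOCATION = @{stage}\n"
        f"  FILE_FORMAT = (TYPE = PARQUET)\n"
        f"  PATTERN = '{pattern}'\n"
        f"  AUTO_REFRESH = TRUE;"
    )

if __name__ == "__main__":
    print(external_table_ddl("raw.supplier_orders", "raw.supplier_landing_stage"))
```

An Airflow ingestion DAG would typically run DDL/COPY statements like this on a schedule; with Snowpipe, AUTO_REFRESH-style event notifications trigger loads instead.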
D) Snowflake Administration & Operations
E) Adhere to and exceed KPIs (to be defined), provide recommendations, and implement process/technology improvements
Qualifications

Skills Required: AWS (key services for data engineers: S3, EMR, EC2, Lambda, Glue, Redshift, Athena), Snowflake, Python, Airflow, DMS
Good working experience in Cloud Data Engineering support, operations, and enhancement projects.


