Be the expert behind unlocking hidden insights and data-driven success.
As a Senior Data Analytics Developer, you will be a key contributor to the development of our data platform, focusing on building foundational data models, creating data marts to enable business insights, and enhancing data observability and monitoring solutions. Leveraging advanced technologies such as Spark, dbt, and data warehouses, you'll design high-performance, scalable data pipelines and frameworks.
What’s exciting about this role?
Your efforts will directly impact the discovery of valuable product insights and support the creation of data-driven product features.
Here is a glimpse of your responsibilities:
- Collaborate with cross-functional teams to define data requirements and implement efficient data pipelines for comprehensive analysis of user interactions within applications.
- Build scalable data pipelines that can handle hundreds of millions of events daily.
- Optimize data processing by setting up pre-calculations in data storage solutions to enhance the performance of reporting dashboards.
- Continuously improve system performance and ensure the analytics tools are current with industry standards.
- Provide support to team members by assisting with data extraction and query analysis, fostering a collaborative environment.
- Coordinate with teams responsible for product integration to ensure adherence to data standards and protocols across applications.
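To give a flavor of the pre-calculation work mentioned above, here is a minimal, hypothetical Python sketch that rolls raw events up into daily counts before they reach a reporting dashboard. The event schema and field names (`timestamp`, `event_type`) are illustrative assumptions, not a description of Coveo's actual pipeline:

```python
from collections import defaultdict
from datetime import datetime

def daily_rollup(events):
    """Pre-aggregate raw events into per-day, per-event-type counts.

    `events` is an iterable of dicts with hypothetical keys
    'timestamp' (an ISO 8601 string) and 'event_type'.
    """
    counts = defaultdict(int)
    for event in events:
        day = datetime.fromisoformat(event["timestamp"]).date().isoformat()
        counts[(day, event["event_type"])] += 1
    # Flatten to rows suitable for loading into a reporting table,
    # so dashboards query a small rollup instead of raw events.
    return [
        {"day": day, "event_type": etype, "event_count": n}
        for (day, etype), n in sorted(counts.items())
    ]

events = [
    {"timestamp": "2024-05-01T09:15:00", "event_type": "click"},
    {"timestamp": "2024-05-01T09:16:30", "event_type": "click"},
    {"timestamp": "2024-05-02T11:00:00", "event_type": "search"},
]
rows = daily_rollup(events)
# rows[0] == {"day": "2024-05-01", "event_type": "click", "event_count": 2}
```

In practice this kind of rollup would typically live in a dbt model or Spark job rather than application code, but the shape of the computation is the same.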
Here is what will qualify you for the role:
- You have 5 years of experience in a similar role, with expertise in building data pipelines, data models, and analytics for large-scale distributed systems.
- You possess a strong engineering background, with hands-on experience in Python, dbt, Snowflake, and/or Databricks.
- You have in-depth knowledge of SQL, ETL processes, data warehousing, and Data Lake architecture.
- You have experience with automated testing frameworks and a commitment to code quality, including unit testing, integration testing, and code coverage metrics.
- You excel at working collaboratively with cross-functional teams and possess exceptional written and verbal communication skills on both technical and non-technical subjects.
- You are proficient with at least one major cloud provider, such as AWS, Google Cloud Platform, or Azure.
What would make you stand out:
- Experience with PySpark and structured streaming.
- Experience orchestrating complex workflows using tools such as Airflow, Dagster, or Prefect.
- Familiarity with infrastructure as code and with CI/CD tools such as Jenkins or GitHub Actions to automate testing, deployment, and integration processes.
- Hands-on experience with containerizing applications and managing them at scale using orchestration platforms.
Do you think you can bring this role to life? Or add your own color? You don’t need to check every single box; passion goes a long way and we appreciate that skillsets are transferable.
Send us your application; we want to hear from you!
Join the Coveolife!
We encourage all qualified candidates to apply regardless of, for example, age, gender, disability, gaps in CV, national or ethnic background.
#li-hybrid