
Provectus

Senior Data Engineer

Reposted 4 Days Ago
Remote
3 Locations
Senior level
Provectus is a leading AI consultancy and solutions provider specializing in Data Engineering and Machine Learning. With a focus on helping businesses unlock the power of their data, we leverage the latest technologies to build innovative data platforms that drive results. Our Data Engineering team consists of top-tier professionals who design, implement, and optimize scalable, data-driven architectures for clients across various industries.

We are seeking a talented and experienced Senior Data Engineer to join our team at Provectus. As part of our diverse practices, including Data, Machine Learning, DevOps, Application Development, and QA, you will collaborate with a multidisciplinary team of data engineers, machine learning engineers, and application developers.

Responsibilities:

  • Collaborate closely with clients to deeply understand their existing IT environments, applications, business requirements, and digital transformation goals.
  • Collect and manage large volumes of varied data sets.
  • Work directly with ML Engineers to create robust and resilient data pipelines that feed Data Products.
  • Define data models that integrate disparate data across the organization.
  • Design, implement, and maintain ETL/ELT data pipelines.
  • Perform data transformations using tools such as Spark, Trino, and AWS Athena to handle large volumes of data efficiently.
  • Develop, continuously test, and deploy Data API Products with Python and frameworks like Flask or FastAPI.

Requirements:

  • Experience handling real-time and batch data flow and data warehousing with tools and technologies like Airflow, Dagster, Kafka, Apache Druid, Spark, dbt, etc.
  • Experience with AWS.
  • Proficiency in programming languages relevant to data engineering, such as Python and SQL.
  • Proficiency with Infrastructure as Code (IaC) technologies like Terraform or AWS CloudFormation.
  • Experience in building scalable APIs.
  • Familiarity with Data Governance aspects like Quality, Discovery, Lineage, Security, Business Glossary, Modeling, Master Data, and Cost Optimization.
  • Upper-Intermediate or higher English skills.
  • Ability to take ownership, solve problems proactively, and collaborate effectively in dynamic settings.

Nice to Have:

  • Experience with Cloud Data Platforms (e.g., Snowflake, Databricks).
  • Experience in building Generative AI Applications (e.g., chatbots, RAG systems).
  • Relevant AWS, GCP, Azure, Databricks certifications.
  • Knowledge of BI Tools (Power BI, QuickSight, Looker, Tableau, etc.).
  • Experience in building Data Solutions in a Data Mesh architecture.

We offer:

  • Internal training programs (Leadership, Public Speaking, etc.) with full support for AWS and other professional certifications.
  • Access to the latest AI tools and premium subscriptions, with the freedom to use them in your daily work.
  • Long-term B2B collaboration.
  • 100% remote, with flexible hours.
  • Collaboration with an international, cross-functional team.
  • Comprehensive private medical insurance or budget for your medical needs.
  • Paid sick leave, vacation, and public holidays.
  • Equipment and all the tech you need for comfortable, productive work.
  • Special gifts for weddings, childbirth, and other personal milestones.

Top Skills

Airflow
Apache Druid
AWS
AWS Athena
AWS CloudFormation
BI Tools
Dagster
Databricks
dbt
Kafka
Python
Snowflake
Spark
SQL
Terraform
Trino
