
DLG (Digital Luxury Group)

Senior Data Engineer

Posted 6 Days Ago
Remote
Hiring Remotely in Poland
Senior level

Senior Data Engineer

Remote (Poland or Romania)  |  Monthly travel to Geneva  |  Full-time  |  Employment via Employer of Record

DLG builds LuxuryIQ, a market intelligence platform used by 80+ luxury brands. The platform ingests data from 50+ sources (social platforms, search engines, secondary markets, advertising networks) into BigQuery, which feeds client-facing analytics and AI-powered tools.

The data infrastructure exists and runs in production. It needs an experienced engineer to take ownership, improve reliability, expand coverage, and enforce quality standards across the full pipeline.

What makes this different: your clean, documented data will power a conversational AI layer (via Model Context Protocol) that lets luxury brand CEOs query market intelligence in natural language. You’re not building another analytics pipeline — you’re building the foundation of an AI product.

SCOPE

As our most senior individual contributor on the data team, you will set the technical direction for pipeline architecture and data quality standards. You will take end-to-end ownership of mission-critical pipelines — from raw ingestion through to clean, documented delivery in BigQuery — and be the go-to technical authority on data behaviour across the platform.

Responsibilities

  • Design and develop data processing, cleansing, transformation, and QA scripts using Python and SQL
  • Own the operation, maintenance, and enhancement of existing mission-critical data pipelines and ETLs
  • Monitor pipeline reliability and data quality standards; identify and close coverage gaps systematically
  • Optimise ETL performance and migrate legacy pipelines into a more scalable architecture
  • Define and enforce data quality standards across the full pipeline
  • Contribute to architectural decisions on ingestion, transformation, and delivery layer design
  • Own the full data lifecycle documentation: every transformation rule, edge case, and business logic decision must be spec'd, not held in someone's memory
  • Fetch datasets from external data sources: REST APIs, JSON, CSV
  • Perform data migrations and batch data updates
  • Conduct exploratory analysis to support new business requirements and data source onboarding
  • Develop proof-of-concept pipelines for new data initiatives
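
To make the cleansing-and-QA responsibility concrete, here is a minimal Python sketch of the pattern: normalise raw multi-source records into a delivery schema, then run batch-level QA checks before anything is loaded downstream (e.g. into BigQuery). All field names and rules are illustrative assumptions, not DLG's actual schema.

```python
# Hypothetical sketch: cleanse raw records and run basic QA checks
# before loading downstream. Field names are illustrative only.
from datetime import datetime, timezone


def cleanse_record(raw: dict) -> dict:
    """Normalise one raw record into the delivery schema."""
    return {
        "brand": raw.get("brand", "").strip().lower(),
        "source": raw["source"],
        "price_eur": round(float(raw["price"]), 2),
        "observed_at": datetime.fromisoformat(raw["ts"]).astimezone(timezone.utc),
    }


def qa_check(rows: list[dict]) -> list[str]:
    """Return a list of QA violations; an empty list means the batch passes."""
    errors = []
    for i, row in enumerate(rows):
        if not row["brand"]:
            errors.append(f"row {i}: empty brand")
        if row["price_eur"] <= 0:
            errors.append(f"row {i}: non-positive price")
    return errors


raw_batch = [
    {"brand": "  Cartier ", "source": "resale_api", "price": "5100.004", "ts": "2024-05-01T09:30:00+02:00"},
    {"brand": "Omega", "source": "search", "price": "0", "ts": "2024-05-01T10:00:00+00:00"},
]
clean = [cleanse_record(r) for r in raw_batch]
print(qa_check(clean))  # one violation: the second row has a non-positive price
```

In a production pipeline the same idea scales up via pandas vectorised operations and scheduled Airflow tasks, but the contract is identical: a batch only moves forward when its QA report is empty.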

REQUIREMENTS

Technical

  • Degree in Computer Science, Information Systems, Statistics, Mathematics, or a related field
  • 7+ years of relevant experience in data engineering, database, or ETL development
  • Excellent Python skills (pandas, numpy); able to write production-grade pipeline code
  • Excellent SQL skills and strong database design and development experience
  • Proven ETL development and long-term maintenance experience
  • Apache Airflow experience
  • Entity resolution and deduplication across heterogeneous, multi-source datasets
  • Web scraping, crawler development, and API integration experience
  • Working knowledge of AI/LLM tooling (Claude Code, Copilot, or equivalent)
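
As a rough illustration of the entity-resolution requirement: records from different sources often refer to the same product under slightly different names, and a first-pass dedup clusters them on a normalised match key. The record fields and normalisation rules below are assumptions for the sketch, not DLG's actual pipeline.

```python
# Illustrative sketch of first-pass deduplication across heterogeneous
# sources: group records on a canonical, normalised key.
import re


def normalise(name: str) -> str:
    """Build a canonical match key: lowercase, strip punctuation and spacing."""
    return re.sub(r"[^a-z0-9]", "", name.lower())


def deduplicate(records: list[dict]) -> dict[str, list[dict]]:
    """Cluster records from multiple sources by their normalised key."""
    clusters: dict[str, list[dict]] = {}
    for rec in records:
        clusters.setdefault(normalise(rec["name"]), []).append(rec)
    return clusters


records = [
    {"name": "Rolex Submariner", "source": "marketplace_a"},
    {"name": "ROLEX  SUBMARINER", "source": "marketplace_b"},
    {"name": "rolex-submariner", "source": "scraper"},
    {"name": "Patek Philippe Nautilus", "source": "marketplace_a"},
]
clusters = deduplicate(records)
print(len(clusters))  # 2 distinct entities across 4 source records
```

Key normalisation only catches the easy cases; real multi-source reconciliation layers fuzzy matching and business rules on top of it.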
 

Preferred

  • BigQuery (primary stack) and/or GCP experience — strongly preferred
  • PostgreSQL experience
  • dbt experience
  • Experience with AliCloud, AWS, or similar cloud platforms
  • Big Data processing technologies: Databricks, Spark, Redis, Kafka
 

Behavioural

  • You take full ownership of your scope and drive it independently
  • You identify gaps in pipeline coverage and close them systematically
  • You document thoroughly — undocumented systems are unreliable systems
  • You treat data quality as non-negotiable — you've been burned by bad data before and it shows

IDEAL BACKGROUND

You’ve worked with scraped, unstructured, messy real-world data at scale. You know what it’s like to reconcile the same entity across dozens of sources with different naming conventions, missing fields, and inconsistent formats.

 

Strong-fit industries: price comparison and travel aggregation platforms, marketplace intelligence and web analytics, real estate platforms, alternative data providers for finance, or competitive intelligence companies.

The key differentiator: Have you built a data product that external customers pay for? That is a different mindset from running internal data ops. Interest in the luxury industry is a plus but not required; the data engineering challenges are universal.

Top Skills

Apache Airflow
BigQuery
Databricks
dbt
GCP
Kafka
Postgres
Python
Redis
Spark
SQL
