As a Senior Data Engineer, you will architect intelligent data ecosystems, build ETL pipelines, mentor team members, and optimize data processes using AI technologies.
We build the tech that moves industries forward. We have our eyes set on AI, energy, logistics, sports and other complex and exciting segments.
We believe in an innovative approach to solving deep issues and encourage our people to find their own solutions. We are constantly rethinking processes, business models, architecture, and tech stacks.
We foster a sense of curiosity, experimentation, and passion beyond code. With us, you can easily deepen your knowledge in any field you’re curious about. And because we work across many industries, you’ll be gaining the experience others can only dream of.
At the forefront of reimagining how industries operate, we are a team of builders and thinkers reshaping e-commerce, ticketing, and logistics from first principles. Our work is grounded in curiosity, experimentation, and a drive for real business impact. We eliminate inefficiencies—not just in code, but in legacy models and outdated assumptions. For those who seek to solve complex problems and mentor others in the process, this is a place to thrive.
We are looking for a Senior Data Engineer who combines deep technical expertise with a strategic mindset. In this role, you’ll architect and build intelligent data ecosystems that power autonomous workflows—integrating Generative and Agentic AI to help businesses move faster, think smarter, and operate more efficiently. Equal parts architect and builder, you’ll be instrumental in delivering high-impact, AI-powered solutions across diverse industries.
In this role, you will
- Analyze and optimize business processes by collaborating with stakeholders to uncover inefficiencies and define data requirements for automation
- Design scalable, modular data architectures that integrate with Generative AI and Agentic AI systems to support real-time decision-making
- Engineer robust ETL/ELT pipelines using Python, cloud-native services, and orchestration tools, supporting both batch and streaming data needs
- Architect RAG and vector database solutions using semantic search to enable LLMs to retrieve curated, context-rich business data
- Build intelligent data products, from predictive models and decision engines to AI-driven insights platforms
- Implement data quality, validation, and governance frameworks to ensure data integrity, lineage, and compliance across systems
- Lead technical discovery sessions with clients to transform complex business challenges into AI and data-driven opportunities
- Mentor team members on best practices in data engineering, AI integration, and modern cloud architectures
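To make the RAG/vector-database responsibility above concrete, here is a minimal, illustrative sketch of the retrieval step: ranking documents by cosine similarity against a query embedding. The documents, embeddings, and function names are invented for illustration; a real system would use a managed vector database (Pinecone, Weaviate, Chroma, Milvus) and embeddings from an LLM provider rather than hand-written vectors.

```python
import math

# Toy in-memory "vector store": maps document text to a pre-computed
# embedding. In production this would be a managed vector database
# populated by an embedding model; these 3-d vectors are placeholders.
DOCS = {
    "Q3 revenue grew 12% in the logistics segment": [0.9, 0.1, 0.0],
    "New ticketing API launches next sprint": [0.1, 0.8, 0.2],
    "Warehouse sensor data pipeline backfilled": [0.2, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_embedding, k=2):
    """Return the k documents most similar to the query embedding --
    the retrieval half of a RAG pipeline, run before prompting the LLM."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_embedding, DOCS[d]),
                    reverse=True)
    return ranked[:k]

# A query embedding close to the revenue/logistics document:
print(retrieve([0.85, 0.15, 0.05], k=1))
```

The retrieved snippets would then be injected into the LLM prompt as curated, context-rich business data, which is the point of the RAG architecture mentioned above.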
What you will bring
- Expert-level Python proficiency for data engineering, including API integrations, data transformations (Pandas, PySpark), and automation
- Proven experience designing and deploying large-scale data platforms on AWS, GCP, or Azure
- Strong foundation in building production-grade ETL/ELT pipelines using Apache Airflow, Kafka, Spark, or cloud-native tools
- Hands-on experience with vector databases (e.g., Pinecone, Weaviate, Chroma, Milvus) and implementing semantic search
- Demonstrated knowledge of Generative AI and LLMs, with practical experience in RAG architectures and prompt engineering
- Deep understanding of data governance, quality, and documentation, with a focus on lineage, metadata, and compliance
- Familiarity with cloud services including serverless computing, managed databases, and data warehouses such as BigQuery, Redshift, or Snowflake
- Experience working with complex real-world data environments, including legacy systems, SaaS integrations, APIs, and databases
- Fluency in English, both written and spoken
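As a flavor of the data-quality and validation work described above, the sketch below shows a rule-based quality gate in plain Python: rows that violate any column rule are rejected and reported before load. The rule set, column names, and thresholds are hypothetical examples, not taken from the job description; production pipelines would typically use a dedicated framework instead.

```python
from datetime import date

# Hypothetical rule set for an "orders" feed; names and checks are
# illustrative only.
RULES = {
    "order_id": lambda v: isinstance(v, int) and v > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "order_date": lambda v: isinstance(v, date),
}

def validate(rows):
    """Return (clean_rows, errors): rows passing every rule, plus a list
    of (row_index, column) pairs describing each violation -- the kind of
    check a quality gate runs before data is loaded downstream."""
    clean, errors = [], []
    for i, row in enumerate(rows):
        row_errors = [(i, col) for col, ok in RULES.items()
                      if col not in row or not ok(row[col])]
        if row_errors:
            errors.extend(row_errors)
        else:
            clean.append(row)
    return clean, errors

rows = [
    {"order_id": 1, "amount": 99.5, "order_date": date(2024, 5, 1)},
    {"order_id": -7, "amount": 10.0, "order_date": date(2024, 5, 2)},  # bad id
    {"order_id": 2, "amount": 15.0},                                   # missing date
]
clean, errors = validate(rows)
print(len(clean), errors)
```

Keeping the rejected `(row, column)` pairs rather than silently dropping rows is what enables the lineage and compliance reporting the role calls for.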
What we offer
- A high-performing, ambitious, collaborative, and fun working culture
- Health insurance, a yearly training budget (local and international conferences, language courses), and employee-led workshops
- Flexible working hours
- Unlimited WFH (work from home) policy
- Bonus for referrals
- For those who dream of traveling: WFA (work from anywhere) possibilities in NFQ-approved countries
- B2B contracts include paid annual service break and paid public holidays in Poland
- Office perks and team activities
Salary range:
103 - 164 PLN/h + VAT (B2B)
14 270 - 22 200 PLN gross (Permanent)
If you have any questions, please contact me at [email protected] or via LinkedIn.
Check all our career opportunities here.
Top Skills
Apache Airflow
AWS
Azure
BigQuery
Chroma
GCP
Kafka
Milvus
Pinecone
Python
Redshift
Snowflake
Spark
Weaviate