
Lighthouse

Senior Data Engineer

Remote
Hiring Remotely in Metropolitan Area Apt, ON
Senior level
As a Senior Data Engineer, you'll maintain data integrations, resolve technical issues, communicate with stakeholders, and improve support processes.

At Lighthouse, we’re on a mission to disrupt commercial strategy for the hospitality industry. Our innovative commercial platform takes the complexity out of data, empowering businesses with actionable insights, advanced pricing tools, and cutting-edge business intelligence to unlock their full revenue potential.

Backed by $370 million in Series C funding and driven by an unwavering passion for growth, we've welcomed five companies into our journey and surpassed $100 million in ARR in 2024. Our 850+ teammates span 35 countries and represent 34 nationalities.

At Lighthouse, we’re more than just a workplace – we’re a community. Collaborative, fun, and deeply committed, we work hard together to revolutionize the hospitality sector. Are you ready to join us and shine brighter in the industry’s most exciting rocket-ship? 🚀

What you will do

As a Senior Data Engineer, you will be a key player in the end-to-end lifecycle of our large-scale ingestion and transformation engine. You will own the full engineering loop: designing, developing, scaling, and maintaining data pipelines that handle large data streams, on the order of 100 TB per day. The role combines greenfield data engineering, deep dives into the complex hospitality data domain, and improving, maintaining, and supporting the stability and quality of existing pipelines.

Where you will have impact

  • End-to-End Pipeline Engineering: Design, build, and deploy scalable data pipelines that ingest data from a wide variety of sources (APIs, webhooks, SFTP, cloud storage).
  • Domain Modeling: Design and implement complex domain models and transformation logic to unify disparate data sources into high-fidelity data products.
  • Architectural Improvements: Transition legacy processes and components into modern, scalable, and stable data architectures.
  • Operational Excellence: Improve the reliability and quality of our data pipelines by building advanced observability, automated testing, automation, and self-healing features that proactively address root causes.
  • Collaboration: Work closely with product, other engineering teams, support and business teams, customers, and partners.
  • Issue Resolution: Lead the investigation and resolution of complex technical issues, performing deep-dive analysis of our code and systems to identify root causes in your domain.
  • Engineering Velocity: Apply AI thoughtfully to speed up development, and leverage AI and automation to streamline complex debugging, root-cause investigation, and recurring tasks.

About our team

You will join one of the Integration Engineering teams as one of its most senior members. Our mission is to reliably transform, integrate, and store the numerous data sources that power our BI products, from web-scraped data to direct API integrations. We handle a staggering amount of information: our main dataset holds over 3 trillion hotel rates, and we process over 100 TB of data daily. You'll join a highly talented and collaborative group of around 6-8 data engineers.

What's in it for you?

  • Flexible time off: Autonomy to manage your work-life balance.
  • Alan Flex benefits: 160€/month for food or nursery.
  • Flexible remuneration: Optional benefits through tax-free payroll deductions for food, transportation, and/or nursery.
  • Wellbeing support: Subsidized ClassPass subscription.
  • Comprehensive health insurance: 100% Alan coverage for you, your spouse, and dependents.
  • Impactful work: Shape products relied on by 85,000+ users worldwide.
  • Referral bonuses: Earn rewards for bringing in new talent.

Who you are

  • You have experience architecting, designing, implementing, testing, deploying and maintaining complex data architectures and data pipelines.
  • You take pride in the software you build, from the initial design and implementation to the monitoring and continuous improvement of the system in production.
  • Professional proficiency in Python for large-scale data processing and pipeline development.
  • Strong knowledge of cloud data technologies, data streaming systems (such as Kafka or Google Cloud Pub/Sub), and cloud database solutions (such as BigQuery, Snowflake, or Databricks), plus hands-on experience with a major cloud platform such as GCP, AWS, or Azure.
  • A forward-thinking builder who uses AI-assisted tools to ship higher-quality code faster, and who is eager to apply AI and automation to solve scale challenges in data engineering and day-to-day operations.
  • Fluency in English, with excellent communication and stakeholder management skills. You can navigate deep technical discussions with your fellow engineers and explain complex data flows to non-technical stakeholders with ease.

Technologies you will work with

Python, Google Cloud Platform (GCP), BigQuery, Spanner, Pub/Sub, Kubernetes.
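To give a flavor of the domain-modeling work described above, here is a minimal, hypothetical Python sketch of unifying rate records from two differently shaped sources (a direct API feed and a web-scraped feed) into one schema. All field names and source formats are invented for illustration; they are not Lighthouse's actual data model.

```python
from datetime import date


def normalize_rate(record: dict, source: str) -> dict:
    """Map a raw rate record from a hypothetical source format into a
    single unified schema (all field names are illustrative)."""
    if source == "api":
        # Direct API integration: nested fields, ISO dates, price in cents.
        return {
            "hotel_id": record["hotel"]["id"],
            "stay_date": date.fromisoformat(record["stayDate"]),
            "price": record["priceCents"] / 100,
            "currency": record["currency"],
        }
    if source == "scrape":
        # Web-scraped feed: flat fields, price as a decimal string.
        return {
            "hotel_id": record["property_id"],
            "stay_date": date.fromisoformat(record["checkin"]),
            "price": float(record["rate"]),
            "currency": record.get("ccy", "EUR"),
        }
    raise ValueError(f"unknown source: {source}")


# Two differently shaped records converge on the same unified schema.
unified = [
    normalize_rate(
        {"hotel": {"id": "h1"}, "stayDate": "2024-06-01",
         "priceCents": 12999, "currency": "USD"},
        source="api",
    ),
    normalize_rate(
        {"property_id": "h1", "checkin": "2024-06-01", "rate": "129.99"},
        source="scrape",
    ),
]
```

In production this kind of logic would sit behind streaming consumers (e.g., Pub/Sub subscriptions) and write to BigQuery; the sketch strips that away to show only the pure transformation step.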

#LI-Remote

Thank you for considering a career with Lighthouse. We are committed to fostering a diverse and inclusive workplace that values equal opportunity for all. We welcome candidates from all backgrounds, regardless of age, gender, race, religion, sexual orientation, and disability. We actively encourage applications from individuals with disabilities and are dedicated to providing reasonable accommodations throughout the recruitment process and during employment to ensure all qualified candidates can participate fully. Our commitment to equality is not just a policy; it's part of our culture.

If you share our passion for innovation and teamwork, we invite you to join us in shaping the future of the hospitality industry. At Lighthouse, our guiding light is to be an equal opportunity employer, and we encourage individuals from all walks of life to apply. Not ticking every box? No problem! We value diverse backgrounds and unique skill sets. If your experience looks a little different from what we've described, but you're passionate about what we do and are a quick learner, we'd love to hear from you.

We value the unique perspective and talents that you bring, and we're excited to see how your light can shine within our team. We can't wait to meet you and explore how we can grow and succeed together, illuminating the path towards a brighter future for the industry.
