
LiveKit

Data Engineer

Reposted Yesterday
Remote
Hiring Remotely in Canada
Senior level

LiveKit is building the infrastructure layer for the voice-driven era of computing. Our platform gives developers everything they need to build, test, deploy, scale, and observe agents in production. Founded in 2021, LiveKit powers voice AI applications for OpenAI, xAI, Salesforce, Coursera, Spotify, and thousands of others, collectively facilitating billions of calls each year.

You'll thrive at LiveKit if you:
  • obsess over crafting code that is fast, reliable, and practical for the problem at hand

  • are known as the go-to person for tackling tough technical problems

  • work hard and can build and ship fast

  • can clearly explain complex technical concepts to others

  • are a fast learner, frequently picking up new languages and tools

The best way to impress us is with thoughtful Issues and/or PRs on our GitHub repos 😊

About This Role:

As a Data Engineer at LiveKit, you'll own the analytics infrastructure that powers our business intelligence and data analysis capabilities. Working closely with the Head of Data and analytics peers, you'll design and implement scalable GCP-based data pipelines — from ingestion through transformation to delivery — making the most of the GCP ecosystem for cost-effective solutions while integrating additional services or homegrown tooling where appropriate. While analytics infrastructure is the core focus, you'll also engage with the broader application data infrastructure, contributing your data pipeline expertise to support product and engineering needs. This is a foundational IC role with significant ownership over the architecture and direction of our analytics stack as the team grows.

What You'll Do:

Own the Analytics Infrastructure: You are the end-to-end owner of our GCP-based data infrastructure — including ingestion, movement, storage, security, and availability. You build and operate reliable, scalable pipelines that power analytics, and partner closely with the Analytics team on downstream transformation and BI.

Maximize the Cloud Ecosystem: Build cost-effective solutions primarily within GCP-native services, while bringing transferable cloud infrastructure expertise. Know when to extend with third-party tooling or homegrown solutions, and make pragmatic tradeoffs.

Contribute Across Data Infrastructure: While analytics is the primary focus, you'll bring broad data pipeline expertise to application data needs in collaboration with the product engineering team.

Managed Services First: Favor managed solutions over self-hosting. Evaluate build vs. buy with cost and operational burden in mind.

Engineering Standards: This role reports to the Head of Data within the Engineering org. Expect PR reviews, automated testing, proper change management, and production-grade standards.

AI-First Development: Work extensively with AI coding assistants and contribute to evolving our AI development workflows and infrastructure.

Startup Pace: Priorities shift quickly. Balance long-term architectural thinking with the tactical execution the moment requires.

Who You Are:
  • 8+ years of experience in data engineering with strong Python and SQL expertise. You've built analytics data infrastructure from scratch — ideally more than once — and owned the architecture end-to-end

  • Experience with cloud-native data infrastructure (GCP preferred; strong AWS builders who can translate cloud concepts are welcome). Familiarity with BigQuery, Dataflow, Cloud Storage, or equivalent services

  • Proven ability to design and implement production-grade data pipelines and aggregation layers for BI and analysis

  • AI-first development mindset with hands-on experience building AI-driven workflows and effectively using AI coding assistants

  • Strong understanding of data modeling, transformation patterns, and working with dbt

  • Experience with data movement tools (Estuary, Airbyte, Fivetran, or similar)

  • Solid infrastructure and DevOps fundamentals: Terraform or similar IaC, CI/CD, Git workflows, and change management

  • Experience implementing observability and monitoring for data systems (Datadog, Grafana, or similar)

  • Strong communication skills and ability to work cross-functionally with engineering and business stakeholders

  • Self-directed and comfortable with ambiguity in a fast-paced startup environment

  • Located in the US or Canada

Bonus:
  • Experience coordinating with dbt and analytics engineering teams

  • Background with AI workflow tools (n8n or similar)

  • Background with AI coding assistants

  • Prior experience as an early infrastructure hire building from the ground up

  • Prior experience building on GCP/BigQuery in production

Our Commitment to You:
  • An opportunity to build something truly impactful to the world

  • Contribute to open source alongside world-class engineers

  • Competitive salary and equity package

  • Health, dental, and vision benefits

  • Flexible vacation policy

LiveKit is an equal opportunity employer and does not discriminate on the basis of any characteristic protected by applicable law. If you require a reasonable accommodation during the application or interview process, please contact [email protected].

Top Skills

Airbyte
BigQuery
Cloud Storage
Datadog
Dataflow
dbt
Estuary
Fivetran
GCP
Grafana
Python
SQL
Terraform
