Data Engineer

LiveKit

Full-time

Location Type: Remote

Location: United States

Salary

💰 $195,000 - $245,000 per year

About the role

  • Own the Analytics Infrastructure: You are the end-to-end owner of our GCP-based data infrastructure — including ingestion, movement, storage, security, and availability. You build and operate reliable, scalable pipelines that power analytics, and partner closely with the Analytics team on downstream transformation and BI.
  • Maximize the GCP Ecosystem: Build cost-effective solutions anchored in GCP-native services. Know when to extend with third-party tooling or homegrown solutions, and make pragmatic tradeoffs.
  • Contribute Across Data Infrastructure: While analytics is the primary focus, you'll bring broad data pipeline expertise to application data needs in collaboration with the product engineering team.
  • Managed Services First: Favor managed solutions over self-hosting. Evaluate build vs. buy with cost and operational burden in mind.
  • Engineering Standards: This role reports to the Head of Data within the Engineering org. Expect PR reviews, automated testing, proper change management, and production-grade standards.
  • AI-First Development: Work extensively with AI coding assistants and contribute to evolving our AI development workflows and infrastructure.
  • Startup Pace: Priorities shift quickly. Balance long-term architectural thinking with the tactical execution the moment requires.

Requirements

  • 8+ years of experience in data engineering with strong Python and SQL expertise
  • Deep expertise in GCP, with hands-on experience in BigQuery, Dataflow, Cloud Storage, and related analytics services
  • Proven ability to design and implement production-grade data pipelines and aggregation layers for BI and analysis
  • AI-first development mindset with hands-on experience building AI-driven workflows and effectively using AI coding assistants
  • Strong understanding of data modeling, transformation patterns, and working with dbt
  • Experience with data movement tools (Estuary, Airbyte, Fivetran, or similar)
  • Solid infrastructure and DevOps fundamentals: Terraform or similar IaC, CI/CD, Git workflows, and change management
  • Experience implementing observability and monitoring for data systems (DataDog, Grafana, or similar)
  • Strong communication skills and ability to work cross-functionally with engineering and business stakeholders
  • Self-directed and comfortable with ambiguity in a fast-paced startup environment
  • Located in the US or Canada

Benefits

  • Competitive salary and equity package
  • Health, dental, and vision benefits
  • Flexible vacation policy

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to improve ATS matches.

Hard Skills & Tools

Python, SQL, GCP, BigQuery, Dataflow, Cloud Storage, data modeling, transformation patterns, dbt, data pipelines

Soft Skills

strong communication skills, self-directed, comfortable with ambiguity, cross-functional collaboration, tactical execution, long-term architectural thinking