OpenRouter

Senior Data Engineer

Full-time

Location: 🇺🇸 United States

Salary

💰 $175,000 - $200,000 per year

Job Level

Senior

Tech Stack

ETL · Google Cloud Platform · Kafka · Postgres · Python · SQL · Terraform

About the role

  • Architect, build, and own end-to-end data platform and warehousing for OpenRouter's LLM marketplace.
  • Stand up central analytics store: define schemas, partitioning, retention, and performance strategies for scale; establish data contracts and documentation.
  • Build, operate, and monitor robust ETL/ELT pipelines with CDC and batch/stream ingestion; ensure idempotency, safe backfills, and predictable reprocessing.
  • Design and ship customer data products and metrics (latency, throughput, error rates, reliability/SLOs, cost) with handling for late/duplicate events.
  • Build secure, multi-tenant datasets and APIs; enforce isolation with row/column-level security, access controls, and privacy guardrails.
  • Take data features from concept to production: design, implement, backfill/migrate, document, and support.

Requirements

  • 4+ years as a Data Engineer (or similar), including owning production pipelines and a modern data warehouse (ClickHouse, Snowflake, Databricks, etc.).
  • Expert-level SQL and Python, with deep experience in ETL/ELT design, data modeling, and performance tuning.
  • Experience building end-to-end data infrastructure: storage, compute, networking, orchestration, CI/CD for data, monitoring/alerting, and cost management.
  • Excellent communicator who can self-manage, set expectations, and partner across functions while working asynchronously.
  • Customer-obsessed and product-minded, starting from the user problem and shipping pragmatically to deliver value fast.
  • Biased to action, comfortable with ambiguity, and able to prioritize for impact; default to simple, reliable solutions and iterate quickly.
  • Bonus: Experience as the first/early data hire, building from 0 to 1.
  • Bonus: Direct experience with ClickHouse, Postgres, GCP, or Terraform.
  • Bonus: Experience standing up event streaming platforms (e.g., Kafka, Pub/Sub).
  • Bonus: Experience with database privacy/compliance standards (e.g., SOC 2, GDPR/CCPA).