Koala Health

Data Engineer

Employment Type: Full-time

Location Type: Remote

Location: United States

Salary

$125,000 - $150,000 per year

About the role

  • Own and evolve Koala Health’s end-to-end data infrastructure, including ingestion, transformation, modeling, and delivery.
  • Design and maintain reliable data pipelines from production systems (e.g., application databases, third-party tools, vendors); a brief sketch of this pattern follows the list.
  • Build and manage data models that support analytics, reporting, and operational use cases.
  • Establish and enforce best practices for data quality, testing, monitoring, and documentation.
  • Partner with stakeholders across product, operations, finance, and marketing to understand data needs and translate them into scalable solutions.
  • Improve the reliability, performance, and cost-efficiency of the data stack as the business grows.
  • Own incident response and debugging for data issues, proactively identifying and resolving root causes.
  • Create and maintain clear documentation so data assets are understandable and usable across the company.
  • Evaluate and implement tooling improvements where they meaningfully improve developer velocity or data quality.
  • Act as a thought partner to leadership on how data can better support decision-making and operational efficiency.
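As a loose illustration of the ingestion-to-modeling loop described above, here is a minimal sketch in Python. It uses sqlite3 as a self-contained stand-in for both the production database and the warehouse (the posting names managed tools like Fivetran and Snowflake for the real stack), and the table names (orders_raw, orders_staging, daily_revenue) are hypothetical, not Koala Health's actual schema.

```python
# Minimal ELT-style step: extract rows from a "production" database,
# load them into a warehouse staging table, then build an analytical model.
# sqlite3 is a stand-in; all table names are hypothetical.
import sqlite3
from datetime import datetime, timezone

def extract_load(source: sqlite3.Connection, warehouse: sqlite3.Connection) -> int:
    """Copy rows from a source table into a warehouse staging table."""
    rows = source.execute(
        "SELECT id, customer_id, total_cents, created_at FROM orders_raw"
    ).fetchall()
    warehouse.executemany(
        "INSERT OR REPLACE INTO orders_staging VALUES (?, ?, ?, ?)", rows
    )
    warehouse.commit()
    return len(rows)

def transform(warehouse: sqlite3.Connection) -> None:
    """Build a simple analytical model: daily revenue per customer."""
    warehouse.executescript("""
        DROP TABLE IF EXISTS daily_revenue;
        CREATE TABLE daily_revenue AS
        SELECT customer_id,
               date(created_at) AS order_date,
               SUM(total_cents) / 100.0 AS revenue
        FROM orders_staging
        GROUP BY customer_id, date(created_at);
    """)

if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    wh = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE orders_raw (id, customer_id, total_cents, created_at)")
    src.executemany(
        "INSERT INTO orders_raw VALUES (?, ?, ?, ?)",
        [(1, "c1", 2500, "2024-05-01 10:00:00"),
         (2, "c1", 1200, "2024-05-01 15:30:00")],
    )
    src.commit()
    wh.execute("CREATE TABLE orders_staging (id PRIMARY KEY, customer_id, total_cents, created_at)")
    loaded = extract_load(src, wh)
    transform(wh)
    print(f"loaded {loaded} rows at {datetime.now(timezone.utc).isoformat()}")
    print(wh.execute("SELECT * FROM daily_revenue").fetchall())
```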

Requirements

  • 4+ years of experience as a Data Engineer, Analytics Engineer, or in a similar role with significant end-to-end ownership.
  • Strong SQL skills and experience building well-structured analytical data models.
  • Hands-on experience working with cloud data warehouses, including Snowflake.
  • Experience designing and operating ELT pipelines using managed tools such as Fivetran (or comparable ELT solutions).
  • Familiarity with modern data platform orchestration and transformation frameworks (e.g., Mozart or similar).
  • Experience enabling analytics and reporting workflows using Looker with an understanding of how data modeling impacts downstream usage.
  • Solid programming experience (Python) for data pipelines and transformations.
  • Experience working with cloud infrastructure (AWS, GCP, or similar).
  • Ability to independently diagnose data quality issues, identify root causes, and implement durable fixes; a sketch of simple automated checks follows the list.
  • Proven ability to work autonomously, manage priorities, and deliver high-quality outcomes in a fast-moving environment.
  • Strong systems thinking and an instinct for building simple, reliable solutions.
  • Excellent communication skills and comfort collaborating with non-technical stakeholders.
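As a rough sketch of the automated data-quality checks mentioned above, the Python below asserts a null-rate ceiling and a freshness window against a warehouse table. Again sqlite3 stands in for the warehouse, and the table and column names are hypothetical.

```python
# Two lightweight data-quality checks: null rate and freshness.
# sqlite3 is a stand-in warehouse; table/column names are hypothetical.
import sqlite3
from datetime import datetime, timedelta, timezone

def check_null_rate(conn, table, column, max_rate=0.01):
    """Fail if the share of NULLs in a column exceeds a threshold."""
    total, nulls = conn.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) "
        f"FROM {table}"
    ).fetchone()
    rate = (nulls or 0) / total if total else 0.0
    assert rate <= max_rate, f"{table}.{column}: null rate {rate:.2%} > {max_rate:.2%}"

def check_freshness(conn, table, ts_column, max_age=timedelta(hours=24)):
    """Fail if the newest row is older than max_age (UTC ISO timestamps assumed)."""
    (latest,) = conn.execute(f"SELECT MAX({ts_column}) FROM {table}").fetchone()
    assert latest is not None, f"{table} is empty"
    newest = datetime.fromisoformat(latest).replace(tzinfo=timezone.utc)
    age = datetime.now(timezone.utc) - newest
    assert age <= max_age, f"{table} is stale: newest row is {age} old"

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders_staging (id, customer_id, created_at)")
    conn.execute(
        "INSERT INTO orders_staging VALUES (1, 'c1', ?)",
        (datetime.now(timezone.utc).replace(tzinfo=None).isoformat(sep=" "),),
    )
    check_null_rate(conn, "orders_staging", "customer_id")
    check_freshness(conn, "orders_staging", "created_at")
    print("all checks passed")
```
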
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
SQL, data modeling, ELT pipelines, Fivetran, Snowflake, Python, data transformation, data quality, data ingestion, data delivery
Soft Skills
communication, autonomy, prioritization, systems thinking, collaboration, problem-solving, documentation, stakeholder engagement, analytical thinking, operational efficiency