Interwell Health

Staff Data Engineer

Full-time

Location Type: Remote

Location: United States

About the role

  • Design and evolve a scalable, secure, cloud-native lakehouse platform leveraging Databricks, Microsoft Fabric (OneLake, Lakehouse, Data Factory), and dbt.
  • Define modeling patterns, governance frameworks, and engineering best practices across the data lifecycle.
  • Lead design reviews and guide teams in adopting scalable architectural patterns.
  • Drive long-term platform strategy and evaluate emerging technologies.
  • Design and implement batch and streaming data pipelines for healthcare data sources (EHR, claims, HL7/FHIR, APIs, flat files, databases).
  • Develop modular ingestion, quality, lineage, metadata, and observability frameworks that scale across domains.
  • Produce clean, analytics-ready datasets and data models for BI, analytics, and machine learning workloads.
  • Implement HIPAA-aligned access patterns and secure handling of PHI.
  • Architect Databricks workloads (clusters, jobs, Unity Catalog, Delta Lake) for reliability, performance, and cost efficiency.
  • Integrate Databricks and Microsoft Fabric with Azure services and enterprise systems.
  • Partner with product managers, data scientists, analysts, clinicians, and business stakeholders to translate healthcare data needs into scalable solutions.
  • Lead cross-functional initiatives that modernize and unify the organization’s data ecosystem.
  • Mentor senior and mid-level engineers; elevate team capability through technical coaching and standards.
  • Drive roadmap planning, platform evolution, and long-term data strategy.
  • Champion engineering excellence, reliability practices, documentation quality, and governance.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 7+ years of experience in data engineering.
  • 2+ years operating in a senior- or staff-level engineering role.
  • Deep hands-on proficiency with Databricks, Spark, Delta Lake, dbt, and Python.
  • Proven ability to design and operate large-scale cloud data platforms (Azure preferred).
  • Hands-on experience with Microsoft Fabric components: Data Engineering, Data Factory, Lakehouse, and OneLake.
  • Advanced data platform architecture and Lakehouse design expertise.
  • Strong command of distributed data processing and cloud-native engineering.
  • Experience working in HIPAA-regulated environments and handling PHI.
  • Healthcare data fluency, including regulated data handling and compliance.
  • Technical leadership, mentorship, and influence across teams.
  • Strong communication skills with both technical and clinical stakeholders.
  • Experience with platform reliability, CI/CD for data pipelines, and infrastructure as code.

Benefits

  • Health insurance
  • Flexible working hours

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Databricks, Spark, Delta Lake, dbt, Python, Data Engineering, Data Factory, Lakehouse, cloud data platforms, CI/CD
Soft Skills
technical leadership, mentorship, influence, strong communication