Superlanet

Senior Databricks Data Engineer – Data Architect, Provider, Patient & Enterprise Data

Contract

Location Type: Remote

Location: Remote • California • 🇺🇸 United States

Salary

💰 $70 - $85 per hour

Job Level

Senior

Tech Stack

Cloud, PySpark, Spark, SQL

About the role

  • Design and build Databricks pipelines for provider, patient, and enterprise data
  • Support migration of existing Health Catalyst workloads
  • Bridge Snowflake and Databricks environments
  • Work closely with clinical, operational, and analytics stakeholders
  • Ensure data quality, lineage, and performance
  • Participate in backlog prioritization and execution

Requirements

  • 5+ years of data engineering or data architecture experience
  • Proficiency with Databricks, Spark, and cloud-based data engineering platforms
  • Strong SQL and data modeling experience
  • Hands-on, production experience building and supporting Databricks pipelines (PySpark / Spark)
  • Demonstrated ability to support data integration and reporting in enterprise environments
  • Experience bridging Snowflake and Databricks environments
  • Healthcare industry experience
  • Strong communication skills with the ability to interact directly with business and technical stakeholders

Benefits

  • Based on experience

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Databricks, Spark, SQL, data modeling, PySpark, data integration, data quality, data lineage, data performance, cloud-based data engineering
Soft skills
communication, stakeholder interaction, collaboration, backlog prioritization