
Senior Data Engineer
Hinge Health
full-time
Location Type: Hybrid
Location: San Francisco • California • United States
Salary
💰 $164,800 - $247,200 per year
About the role
- Design and build the data foundation for AI-powered health experiences and decision sciences - Build and own data pipelines across both streaming and batch paradigms, from ingestion to serving.
- Keep the platform reliable and the data trustworthy - Define and track SLAs (Service Level Agreements) and SLOs (Service Level Objectives) for the pipelines you own, and hold yourself accountable to them.
- Deliver with ownership and grit - Take projects from ambiguous requirements to production, working through technical blockers, cross-functional dependencies, and competing priorities without losing momentum.
- Make it easy for teams to own their data - Build tooling for service and application teams that helps them effectively manage their data and data processes.
- Build with compliance and trust at the center - Implement data handling practices that meet HIPAA, GDPR, and CCPA requirements including PII (Personally Identifiable Information) handling, access controls, data retention, and making production data safely available in non-production environments.
- Raise the bar for the team around you - Participate in hiring and mentor junior engineers.
Requirements
- Bachelor’s degree in Computer Science or a related technical field
- 5+ years of data engineering experience with a proven track record of building and owning reliable production pipelines
- 5+ years of strong proficiency in Python and SQL
- 3+ years of experience processing and storing large-scale data using distributed systems, plus mastery of database design and data modeling, including star and snowflake schemas
- 2+ years of experience working with a broad spectrum of data stores such as PostgreSQL, MySQL, MongoDB, Redis, Databricks, and Redshift
- 3+ years of experience deploying and operating pipelines in the cloud including CI/CD, monitoring, and incident response
- 3+ years of experience building streaming and batch pipelines using tools such as Spark, Kafka, Flink, and Airflow
- 1+ years of experience working with dbt
- 1+ years of experience with AI tools for code generation (Cursor, Claude Code)
- 1+ years of experience with orchestration tools (Airflow, Prefect, or Dagster), including scheduling, retries, alerting, and SLAs
Benefits
- Inclusive healthcare and benefits: On top of comprehensive medical, dental, and vision coverage, we offer employees and their family members help with gender-affirming care, tools for family and fertility planning, and travel reimbursements if healthcare isn’t available where you live.
- Planning for the future: Start saving for the future with our traditional or Roth 401(k) retirement plan options, which include a 2% company match.
- Modern life stipends: Manage your own learning and development.
- Grow with us: purchase discounted company stock through our ESPP with easy payroll deductions.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data engineering, Python, SQL, data modeling, database design, streaming pipelines, batch pipelines, CI/CD, monitoring, incident response
Soft Skills
ownership, grit, mentoring, cross-functional collaboration, problem-solving, accountability, communication, project management, adaptability, teamwork
Certifications
Bachelor’s Degree in Computer Science