
Lead Data Engineer
OneOncology
Employment Type: Full-time
Location Type: Hybrid
Location: Austin • Tennessee • Texas • United States
About the role
- Lead, mentor, and develop engineers across both the Data Operations and Analytics Engineering teams.
- Contribute to a culture of clear goals, open feedback, and continuous growth across the team.
- Promote collaboration and accountability while driving high standards for quality, efficiency, and continuous improvement.
- Lead incident management and resolution for pipeline failures, cluster issues, and data quality problems, driving thorough root cause analysis and preventative improvements.
- Define and enforce operational standards, runbooks, and on-call practices for the team.
- Manage and maintain Databricks Workflows and job orchestration, ensuring SLAs are consistently met.
- Oversee the design, development, and maintenance of data models and transformations that serve business intelligence and analytics use cases.
- Define and enforce analytics engineering best practices, including modular transformation patterns, data testing, and code review standards.
- Partner with data analysts and business stakeholders to understand modeling requirements and ensure data is accurate, accessible, and well understood.
- Additional responsibilities as assigned to help drive our mission of improving the lives of everyone living with cancer.
Requirements
- 8+ years of hands-on experience with SQL development.
- 8+ years of experience working with relational and non-relational databases, with a strong foundation in data modeling, schema design, and query optimization.
- 5+ years of professional experience developing scalable solutions using Python or a similar object-oriented language.
- Proficiency in Databricks, Spark, and Delta tables; experience with large-scale distributed data processing preferred.
- Hands-on experience operating and monitoring data pipelines at scale in a production environment.
- Solid understanding of the Lakehouse and Medallion architectures.
- Experience with Azure data services (ADLS Gen2, Azure Data Factory, Event Hubs, or equivalent).
- Familiarity with Unity Catalog for data governance, access control, and data lineage.
- Proven experience designing data integration/ETL pipelines using tools such as Azure Data Factory or equivalent.
- Excellent communication skills, with the ability to convey technical concepts and operational status to both technical and non-technical stakeholders.
Benefits
- Health insurance
- Flexible work arrangements
- Professional development opportunities
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
SQL • data modeling • schema design • query optimization • Python • Databricks • Spark • Delta tables • data integration • ETL
Soft Skills
leadership • mentoring • collaboration • accountability • communication • problem-solving • continuous improvement • feedback • goal setting • root cause analysis