Salary
💰 $80,400 - $134,000 per year
Tech Stack
Cloud, ETL, PySpark, Python, SQL
About the role
- Design, build, and maintain robust data pipelines using PySpark, SQL, and Python within the Databricks ecosystem.
- Collaborate with cross-functional teams to understand data requirements and translate them into scalable engineering solutions.
- Own the development and deployment of ETL (extract, transform, and load) processes that support enterprise data products and reporting needs.
- Contribute to the design and implementation of data models and architecture in Snowflake and other cloud platforms.
- Participate in code reviews, testing, and documentation to ensure high-quality, maintainable solutions.
- Monitor and optimize pipeline performance, ensuring reliability and scalability across large datasets.
- Support data governance and lineage efforts by implementing best practices in version control (Git) and metadata management.
- Research and propose new tools, frameworks, or approaches to improve data engineering workflows.
- Take initiative in identifying opportunities for automation, efficiency, and innovation within the data engineering space.
Requirements
- Bachelor of Science in Computer Science, Information Technology, Data Science, or a related field (preferred)
- 2+ years of experience in data engineering or a related technical role
- Solid understanding of data engineering principles and cloud-based data platforms
- Hands-on experience with Databricks, PySpark, and SQL in a production environment
- Experience with Git for version control and collaborative development
- Ability to communicate effectively with technical and non-technical stakeholders
- Interest in healthcare data and a desire to make a meaningful impact through data-driven solutions
Benefits
- Medical, vision, dental, and well-being and behavioral health programs
- 401(k) with company match
- Company-paid life insurance
- Tuition reimbursement
- A minimum of 18 days of paid time off per year
- Paid holidays
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
PySpark, SQL, Python, ETL, data models, data architecture, data governance, metadata management, automation, data engineering principles
Soft skills
collaboration, communication, initiative, problem-solving, attention to detail
Certifications
Bachelor of Science in Computer Science, Bachelor of Science in Information Technology, Bachelor of Science in Data Science