
Senior Data Engineer
Sharecare
Full-time
Location Type: Remote
Location: United States
About the role
- Design, build, and maintain scalable data pipelines using Python, Apache Airflow, and Apache Spark (see the orchestration sketch after this list)
- Analyze business and technical requirements and translate them into reliable, future-proof data solutions
- Develop, validate, deploy, and support complex ETL/ELT pipelines at scale
- Build clean, secure, and maintainable REST APIs following company standards
- Implement real-time and batch processing solutions for diverse data sources
- Develop reusable data engineering and AI frameworks for enterprise-wide adoption
- Define and manage domain-based “source of truth” data models, ensuring scalability and end-to-end data lineage
- Implement data governance practices, automate data quality checks, and enable pipeline testing
- Optimize data infrastructure for performance, cost efficiency, and reliability
- Manage source control, CI/CD pipelines, and production deployments
- Partner with data scientists and analysts to support AI/ML initiatives
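The pipeline responsibilities above center on Airflow-orchestrated ETL. As a rough illustration of that pattern only (not Sharecare's actual pipelines), here is a minimal daily DAG sketch, assuming Airflow 2.4 or later; the DAG id and the extract/transform/load callables are hypothetical placeholders.

```python
# Minimal sketch of a daily ETL DAG (hypothetical names; assumes Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_claims(**context):
    # Placeholder: pull a raw claims batch from a source system (e.g. S3 or SFTP).
    print("extracting raw claims batch")


def transform_claims(**context):
    # Placeholder: clean, deduplicate, and conform records to the domain model.
    print("transforming claims batch")


def load_claims(**context):
    # Placeholder: load conformed records into the warehouse.
    print("loading claims batch")


with DAG(
    dag_id="claims_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_claims)
    transform = PythonOperator(task_id="transform", python_callable=transform_claims)
    load = PythonOperator(task_id="load", python_callable=load_claims)

    # Linear dependency chain: extract, then transform, then load.
    extract >> transform >> load
```

In practice the transform step would typically hand off to a Spark job rather than run in-process, which is where the Apache Spark and Databricks experience named in this role comes in.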
Requirements
- Bachelor’s degree (or higher) in Computer Science, Data Engineering, or a related field
- 10+ years of experience in data engineering or related roles
- Strong proficiency in Python and related libraries (Pandas, SQLAlchemy, Boto3, Paramiko, Flask, FastAPI); see the API sketch after this list
- Advanced SQL skills with experience analyzing healthcare datasets (e.g., claims, provider directories)
- Hands-on experience with orchestration tools such as Apache Airflow and Databricks Workflows
- Strong experience with Apache Spark; exposure to Apache Flink is a plus
- Experience integrating with .NET applications and working with SQL Server
- Cloud platform experience required (AWS, Azure, or GCP)
- Experience with data warehousing solutions such as Amazon Redshift or Vertica
- Solid understanding of distributed systems and functional programming concepts
- Proficiency with Git and modern CI/CD practices
- Exposure to streaming technologies, containerization, and ML pipelines preferred
- Familiarity with AI tools and large language models is a plus
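Several requirements above involve building Python REST services with frameworks such as FastAPI. As a hedged illustration only, a minimal FastAPI endpoint serving a provider-directory record might look like the following; the route, model fields, and in-memory store are hypothetical placeholders.

```python
# Minimal sketch of a FastAPI read endpoint (hypothetical route and data).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="provider-directory-api")


class Provider(BaseModel):
    npi: str
    name: str
    specialty: str


# Hypothetical in-memory store; a real service would query a database or warehouse.
_PROVIDERS = {
    "1234567890": Provider(npi="1234567890", name="Dr. Example", specialty="Cardiology"),
}


@app.get("/providers/{npi}", response_model=Provider)
def get_provider(npi: str) -> Provider:
    """Return a single provider record by NPI, or 404 if unknown."""
    provider = _PROVIDERS.get(npi)
    if provider is None:
        raise HTTPException(status_code=404, detail="provider not found")
    return provider
```

A production service would back this with the warehouse or an operational store and add authentication, in line with the "clean, secure, and maintainable" standard named in the role.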
Benefits
- Equal Opportunity Employer
- E-Verify user
- Work-life balance
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, Apache Airflow, Apache Spark, ETL, ELT, REST APIs, SQL, data engineering, data governance, data quality
Soft Skills
analytical skills, problem-solving, collaboration, communication, project management, attention to detail, adaptability, creativity, leadership, organizational skills
Certifications
Bachelor's degree in Computer Science, Bachelor's degree in Data Engineering, related field degree