Salary
💰 $91,300 - $228,200 per year
Tech Stack
Airflow, Amazon Redshift, Apache, AWS, Azure, BigQuery, Cloud, Distributed Systems, ETL, Google Cloud Platform, Java, Kafka, Microservices, Python, Spark, SQL, Terraform
About the role
- Design and develop scalable ETL/ELT solutions and data architectures (data lakes, data warehouses)
- Optimize Snowflake queries and implement data governance best practices
- Collaborate with stakeholders to translate business needs into technical solutions
- Build APIs and integrate with email service providers and HCP data sources
- Lead development efforts and coordinate with offshore teams
- Conduct code reviews and ensure adherence to coding standards
- Support CI/CD processes using tools like Terraform and Airflow
- Maintain high standards of data quality and integrity across healthcare datasets
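The ETL and data-quality duties above can be sketched in miniature. This is an illustrative example only, not the employer's actual stack: a tiny extract → transform → validate → load step in plain Python with SQLite standing in for the warehouse, and the record fields (`hcp_id`, `email`, `visits`) are hypothetical.

```python
import sqlite3

def extract():
    # Hypothetical source records; a real pipeline would pull from an API,
    # email service provider, or HCP data feed.
    return [
        {"hcp_id": "H001", "email": "a@example.com", "visits": "3"},
        {"hcp_id": "H002", "email": None, "visits": "5"},
    ]

def transform(rows):
    # Cast types and enforce a basic data-quality rule (email must be present).
    out = []
    for r in rows:
        if r["email"] is None:
            continue  # a real pipeline would quarantine bad rows, not drop them
        out.append((r["hcp_id"], r["email"], int(r["visits"])))
    return out

def load(rows, conn):
    # Idempotent load: re-running the step overwrites rather than duplicates.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS hcp_activity "
        "(hcp_id TEXT PRIMARY KEY, email TEXT, visits INTEGER)"
    )
    conn.executemany("INSERT OR REPLACE INTO hcp_activity VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
clean = transform(extract())
load(clean, conn)
loaded = conn.execute("SELECT COUNT(*) FROM hcp_activity").fetchone()[0]
```

In a production pipeline each function would typically become a task in an orchestrator such as Airflow, with the quality rule reported rather than silently applied.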
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
- 7+ years of experience in data engineering or a similar role
- Strong proficiency in SQL and query optimization
- 5+ years of experience with Snowflake and cloud platforms (AWS preferred; Azure/GCP a plus)
- 3+ years of experience in Python or Java
- Experience with data pipeline tools (e.g., Apache Airflow, dbt, Kafka, Spark)
- Solid understanding of data modeling, warehousing, and distributed systems
- Familiarity with CI/CD tools and infrastructure-as-code (e.g., Terraform)
Preferred
- Experience with modern analytics databases (e.g., Redshift, BigQuery)
- Exposure to microservices architecture
- Prior involvement in architecture design or leading development projects
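As a small illustration of the SQL query-optimization skill the requirements call for, the sketch below uses Python's built-in `sqlite3` (standing in for a warehouse engine; table and column names are hypothetical) to show how an index changes a query plan from a full table scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claims (id INTEGER PRIMARY KEY, hcp_id TEXT, amount REAL)"
)

query = "SELECT * FROM claims WHERE hcp_id = 'H001'"

# Without an index on hcp_id, the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Adding an index lets the planner seek directly to matching rows.
conn.execute("CREATE INDEX idx_claims_hcp ON claims (hcp_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# The fourth column of each plan row is the human-readable detail string.
before_detail = " ".join(row[3] for row in plan_before)
after_detail = " ".join(row[3] for row in plan_after)
```

Warehouse engines like Snowflake optimize differently (micro-partitions and pruning rather than user-defined indexes), but the habit of reading the plan before and after a change carries over directly.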