Tech Stack
Airflow, BigQuery, Cloud, Google Cloud Platform, Java, Kafka, Python, SQL
About the role
- Migrate and modernize large-scale data workflows to GCP services like BigQuery, Dataflow, Dataproc, and Composer (Airflow) — see the sketch after this list
- Reengineer SQL logic, automate workflows with Python, and optimize performance across the board
- Collaborate with architects and business leaders to ensure data quality, performance, and security
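To give a concrete sense of the day-to-day work described above, here is a minimal sketch of a Composer (Airflow) DAG that runs a scheduled BigQuery SQL transformation. It assumes an Airflow 2.x environment with the Google provider package installed; the DAG id, project, dataset, and table names are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch: a daily Composer/Airflow DAG that runs a BigQuery
# transformation. All identifiers (DAG id, project, dataset, tables)
# are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_claims_rollup",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run a SQL rollup in BigQuery and overwrite a summary table.
    rollup_claims = BigQueryInsertJobOperator(
        task_id="rollup_claims",
        configuration={
            "query": {
                "query": """
                    SELECT claim_date, COUNT(*) AS claim_count
                    FROM `my-project.analytics.claims`  -- hypothetical table
                    GROUP BY claim_date
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "claims_daily_rollup",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )
```

In a migration like the one this role describes, legacy Teradata jobs are typically reexpressed as standard-SQL queries like the one above and orchestrated as DAG tasks, which gives scheduling, retries, and lineage for free from Composer.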
Requirements
- 4–6+ years of Data Engineering experience, with 2+ years in GCP.
- Strong background in Teradata, advanced SQL, and scripting in Python/Java.
- Solid experience with GCP-native tools, CI/CD, and DevOps in data environments.
- GCP Professional Data Engineer Certification.
- Experience with Kafka, AI/ML on GCP, and the healthcare domain.
- Knowledge of data governance in cloud ecosystems.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data engineering, GCP, SQL, Python, Java, CI/CD, DevOps, Kafka, AI/ML, data governance
Soft skills
collaboration, communication, problem-solving, performance optimization, data quality assurance
Certifications
GCP Professional Data Engineer Certification