Tech Stack
Airflow, AWS, Azure, Cloud, Distributed Systems, Python, Scala, Spark, SQL
About the role
- Build, scale, and optimize the data platform using Databricks, Spark, and various data integration tools
- Design and maintain robust data pipelines with Airflow, Fivetran, dbt, and Segment
- Develop and maintain the infrastructure needed to support MLOps practices
- Elevate the team’s skills and knowledge by participating in technical design reviews
Requirements
- 4+ years of hands-on experience in Data Engineering
- Expertise in designing and building distributed systems
- Proficiency in Python, Scala, SQL, and Spark
- Solid understanding of cloud infrastructure (AWS/Azure)
- Experience with data governance, security practices, and compliance
- Excellent communication skills
Benefits
- Health insurance
- Flexible work arrangements
- Professional development
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Data Engineering, Python, Scala, SQL, Spark, MLOps, Data Pipelines, Data Integration, Data Governance, Data Security
Soft skills
Communication, Technical Design