Salary
💰 $85,000 - $170,000 per year
Tech Stack
Airflow, Apache, AWS, Azure, Cloud, ETL, Google Cloud Platform, Linux, Python, Spark, SQL, Unix
About the role
- Design, develop, and optimize robust data pipelines and ETL processes (a minimal sketch follows this list)
- Build reusable libraries and shared capabilities that enable faster time to market for Data Engineering delivery teams
- Architect and implement scalable data solutions on Google Cloud Platform (GCP)
- Continually improve platform capabilities and maintain overall platform health
- Collaborate with Technology, Product, and Business stakeholders to understand data requirements and deliver solutions
- Ensure data quality, integrity, and security across all data systems
- Mentor and guide junior data engineers and contribute to best practices
- Monitor, troubleshoot, and improve data workflows and performance
- Provide automation capabilities to aid delivery teams with data validation, testing, and deployments
- Stay updated with emerging data engineering technologies and trends
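To make the pipeline and automation duties above concrete, here is a minimal sketch of the kind of Airflow DAG this role would own (Composer is GCP's managed Airflow). The DAG id, task logic, and data are hypothetical placeholders, not details from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    """Pull raw records from a source system (stubbed for illustration)."""
    return [{"id": 1, "amount": 42.0}, {"id": 2, "amount": -1.0}]


def transform(**context):
    """Validate and reshape the extracted records."""
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [r for r in rows if r["amount"] >= 0]  # drop invalid amounts


def load(**context):
    """Write transformed rows to the warehouse (stubbed for illustration)."""
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_daily_etl",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule_interval" on Airflow < 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

A real pipeline would replace the stubbed callables with source and warehouse I/O, but the extract-transform-load task graph is the shape of the work described above.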
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
- 7+ years of experience in data engineering or related roles
- Highly proficient in Python and SQL
- Well-versed in the nuances of software engineering and systems design
- Hands-on experience with cloud data platforms (GCP preferred; experience with AWS or Azure acceptable)
- Hands-on experience with Apache Airflow/Composer, Beam/Dataflow, and Spark/Dataproc (see the sketch after this list)
- Experience working in Linux/Unix environments
- Strong knowledge of ETL tools and data modeling
- Ability to mentor and guide junior data engineers
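For the Spark/Dataproc experience named above, a minimal sketch of a PySpark batch job of the sort that runs on Dataproc is shown below; the bucket paths, column names, and app name are hypothetical placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_orders_rollup").getOrCreate()

# Read raw order events from a (hypothetical) GCS bucket.
orders = spark.read.parquet("gs://example-bucket/raw/orders/")

# Filter out bad records, then aggregate spend per customer per day.
daily_totals = (
    orders
    .where(F.col("amount") >= 0)
    .groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("total_amount"))
)

daily_totals.write.mode("overwrite").parquet(
    "gs://example-bucket/curated/daily_totals/"
)

spark.stop()
```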