Tech Stack
Airflow, Apache, AWS, Docker, Pandas, Python, SQL
About the role
- Lead the end-to-end ownership and evolution of the iPaaS data platform
- Drive the intake and scoping of new data integrations
- Architect and evolve robust infrastructure with a strong DevOps mindset
- Operate and optimize the data lake on AWS S3
- Design and execute scalable data orchestration workflows using Apache Airflow
- Develop and maintain secure and efficient container images for data workloads
- Champion data modeling and transformation best practices with dbt
- Manage and optimize Snowflake environments
- Govern Snowflake resources effectively using Liquibase
- Enforce stringent security and access controls
- Implement and administer the data catalog
- Oversee production monitoring and troubleshooting
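The orchestration responsibilities above center on dependency-ordered workflows. As a minimal stdlib-only sketch (not the platform's actual code), the snippet below shows the core idea behind an Airflow DAG: tasks declare their upstream dependencies and run in topological order. The task names are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps: each task maps to the set of tasks it
# depends on, mirroring upstream/downstream edges in an Airflow DAG.
dag = {
    "extract_from_source": set(),
    "land_in_s3": {"extract_from_source"},
    "load_to_snowflake": {"land_in_s3"},
    "transform_with_dbt": {"load_to_snowflake"},
}

def run_order(dag):
    """Return a valid execution order that respects all dependencies."""
    return list(TopologicalSorter(dag).static_order())

if __name__ == "__main__":
    for task in run_order(dag):
        print(task)
```

In a real deployment, Airflow's scheduler performs this ordering (plus retries, scheduling, and parallelism); the sketch only illustrates the dependency model.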
Requirements
- 6+ years of hands-on experience in data engineering or platform roles
- Deep expertise with the modern data stack
- Proven ability to govern Snowflake environments
- Strong hands-on experience with Snowflake RBAC
- Advanced proficiency in Docker
- Experience with container orchestration platforms like AWS ECS
- Strong background in DevOps practices
- Experience with CI/CD for containers and data
- Expertise in Python, including libraries such as Pandas, SQLAlchemy, requests, Airflow, and boto3
- Strong SQL skills for optimizing transformations and storage layouts
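The Snowflake RBAC requirement above amounts to granting functional roles scoped privileges. A small illustrative helper, assuming hypothetical role, database, schema, and warehouse names (standard Snowflake `GRANT` syntax, but not the employer's actual role model):

```python
def grant_statements(role, database, schema, warehouse):
    """Build the typical GRANT statements for a read-only functional
    role in Snowflake (illustrative only; object names are hypothetical)."""
    return [
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role}",
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role}",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role}",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role}",
        f"GRANT SELECT ON FUTURE TABLES IN SCHEMA {database}.{schema} TO ROLE {role}",
    ]

if __name__ == "__main__":
    for stmt in grant_statements("ANALYST_RO", "ANALYTICS", "MARTS", "WH_XS"):
        print(stmt)
```

Tools like Liquibase (mentioned in the role) can version-control exactly this kind of grant script so role changes are auditable and repeatable.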
Benefits
- Professional development opportunities with international customers
- Collaborative work environment
- Career path and mentorship programs that support advancement to new levels
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data engineering, data integrations, DevOps, data lake, AWS S3, data orchestration, Apache Airflow, container images, Snowflake, SQL
Soft skills
leadership, communication, problem-solving, collaboration, governance