Salary
💰 $95,000 - $160,000 per year
Tech Stack
Airflow · AWS · Cloud · ETL · Python · SQL · Terraform
About the role
- Lead the rebuild of our data stack with a modern Snowflake data lakehouse, architected for scale and performance on AWS
- Design and implement best-in-class AWS data infrastructure using Terraform for provisioning, configuration, and automation
- Influence data architecture, tooling choices, and long-term strategy, ensuring alignment with business needs, technology direction, and growth plans
- Build and optimize scalable ETL/ELT pipelines with AWS services, Python, and Airflow (a minimal orchestration sketch follows this list)
- Establish and enforce rigorous standards for data quality, observability, and governance, including access control, lineage, and compliance requirements
- Prepare and evolve the data platform to support advanced analytics, AI, and machine learning use cases
- Collaborate closely with Product, Engineering, and Customer Success to deliver reliable, trusted data for analytics and reporting
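
For illustration, here is a minimal sketch of the kind of ELT orchestration described above, assuming Airflow 2.4+ with the TaskFlow API. The DAG name, S3 bucket, Snowflake table, and the placeholder print statements are hypothetical and would be replaced by real extract, load, and transformation logic (e.g. the Snowflake provider and dbt).

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["elt"])
def daily_events_elt():
    @task
    def extract_to_s3(ds=None) -> str:
        # Land the day's raw events in S3 (placeholder logic; bucket is hypothetical).
        return f"s3://example-raw-bucket/events/{ds}/"

    @task
    def load_to_snowflake(s3_prefix: str) -> None:
        # Stage the files into a Snowflake table, typically via COPY INTO
        # using the Snowflake provider or connector (placeholder).
        print(f"COPY INTO raw.events FROM '{s3_prefix}'")

    @task
    def run_transformations() -> None:
        # Run downstream SQL/dbt transformations on the staged data (placeholder).
        print("Transforming staged data into analytics models")

    # extract -> load via data dependency; load -> transform via explicit ordering
    load_to_snowflake(extract_to_s3()) >> run_transformations()


daily_events_elt()
```

In practice the load and transform steps would also emit data-quality and lineage metadata to support the observability and governance goals above.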
Requirements
- Hands-on Snowflake experience in production environments
- Proven experience designing and maintaining large-scale data pipelines
- Strong SQL and Python skills for data transformation and orchestration
- Experience with ETL/ELT tooling such as Airflow or dbt
- Familiarity with AWS cloud infrastructure (think S3, CloudTrail, Lambda, Step Functions, EventBridge, and Glue)
- Deep understanding of data modeling, performance optimization, and query tuning
- Experience designing data workflows that ensure data quality, reliability, and performance (see the data-quality check sketch below)
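
As a concrete example of the kind of data-quality check implied by the last requirement, here is a minimal sketch using the snowflake-connector-python package. The warehouse, database, schema, table, and column names are hypothetical placeholders, and credentials are assumed to come from environment variables.

```python
import os

import snowflake.connector


def null_rate(table: str, column: str) -> float:
    """Return the fraction of rows where `column` is NULL in `table`."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
        database="ANALYTICS",      # hypothetical database
        schema="RAW",              # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # Identifiers are interpolated directly, so they must come from trusted config.
        cur.execute(
            f"SELECT COUNT_IF({column} IS NULL) / NULLIF(COUNT(*), 0) FROM {table}"
        )
        (rate,) = cur.fetchone()
        cur.close()
        # An empty table yields NULL; treat it as a zero null rate.
        return float(rate or 0.0)
    finally:
        conn.close()


if __name__ == "__main__":
    rate = null_rate("events", "user_id")
    assert rate < 0.01, f"user_id null rate too high: {rate:.2%}"
```

Checks like this would typically run as Airflow tasks alongside the pipelines described above, failing the run (and alerting) when quality thresholds are breached.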