Tech Stack
Airflow, Python, SQL
About the role
- Design and maintain data pipelines, ELT processes, and the data warehouse
- Create robust ingestion pipelines with partners
- Build processes for optimal data extraction, transformation, and loading
- Maintain documentation of data architecture and processes
- Implement data models in the data warehouse
- Automate tasks with a robust data orchestration system
- Build integration components across platforms
- Ensure data integrity by setting up alerts
Requirements
- 5+ years of experience working in Python
- Experience with ELT patterns (e.g., Airbyte, Fivetran)
- Comfortable with data orchestration systems (Airflow, Astronomer, Dagster)
- Expert in SQL and data modeling
- Experience with data modeling and transformation tools (e.g., dbt)
- Proficient in git-based version control
- Experience in architecture and design of data processing systems
- Experience collaborating with analysts, data scientists, and ML engineers
Benefits
- Flexible work arrangements
- Professional development
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, ELT, SQL, data modeling, data transformation, git, data architecture, data processing systems, data ingestion, data integrity
Soft skills
collaboration, documentation