Mentor mid-level data engineers, fostering their technical and professional growth through code reviews and 1:1 coaching.
Lead the technical design and execution of complex data projects, ensuring alignment with architectural best practices and business objectives.
Partner with stakeholders across Analytics, Growth, and Engineering to define the team's roadmap, gather requirements, and prioritize data engineering initiatives.
Manage the team's backlog and workload, ensuring timely delivery of high-quality data pipelines and products.
Design, build, and maintain scalable ETL/ELT pipelines using Airflow and Meltano.
Develop and optimize dbt models following our established data warehouse architecture.
Implement data quality monitoring, testing, and alerting across all pipelines.
Monitor pipeline performance and optimize for cost and efficiency.
Requirements
5+ years of production experience with Python and SQL.
Proven experience mentoring other engineers and leading complex data projects from inception to completion.
Deep expertise in dbt for data modeling and transformation.
Extensive experience designing, deploying, and managing complex workflows in Apache Airflow.
Proficiency with cloud data warehouses (BigQuery preferred, but Snowflake/Redshift acceptable).
Experience with infrastructure as code (Terraform, Pulumi, or similar).
Strong understanding of data warehouse design patterns and a demonstrated ability to make strategic architectural decisions.
Excellent communication skills and experience collaborating with cross-functional stakeholders.
Experience with Git, CI/CD pipelines, and collaborative development workflows.
Benefits
Health Benefits
ESOP
Tech Allowance
Annual Off-Sites
Flexible Work
Professional Development
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, SQL, dbt, Apache Airflow, ETL, ELT, data modeling, data transformation, infrastructure as code, cloud data warehouses