Design, build, and maintain a robust multi-stage data warehouse infrastructure, with a focus on either Redshift or Snowflake, ensuring scalability, performance, and data integrity
Build and manage efficient ELT pipelines to extract, load, and transform data from various sources into the data warehouse
Implement and optimize dbt for data transformation, modeling, and documentation, ensuring data accuracy and consistency
Implement comprehensive data testing procedures, including unit tests, integration tests, and validation checks, to ensure data accuracy and reliability
Work closely with cross-functional teams, including data scientists, analysts, and product managers, to understand data requirements and provide actionable insights
Stay up-to-date with emerging data engineering technologies and best practices to drive continuous improvement and innovation within the team
Requirements
3-5 years of experience in data engineering
3+ years of experience leading a team, preferably a data team
Strong expertise in Redshift or Snowflake data warehousing technologies
Proficiency in dbt for data transformation and modeling
Experience with data governance, data quality, and compliance
Knowledge of data testing methodologies and tools
Strong programming skills in languages such as SQL and Python
Excellent communication and collaboration skills
Bachelor's degree in Computer Science, Data Engineering, or a related field (nice-to-have)
Master's degree (nice-to-have)
Experience in migrating a DWH from one tech stack to another (nice-to-have)