Tech Stack
Airflow, AWS, ETL, Matillion, Python, SQL
About the role
- Influence the technical roadmap and architecture to enable S&S to use data more effectively
- Help select tools, technologies, and operational practices that allow the company to manage its data at scale
- Develop ETL/ELT pipelines that consolidate data from a variety of sources into the company’s data warehouse
- Manage existing ETL/ELT pipelines, including monitoring, troubleshooting, and optimizing their performance
- Partner with business teams to understand their data needs and prioritize new development work
- Design and prototype solutions in Snowflake and AWS that help business teams answer complex questions
- Provide input on data management best practices to ensure the company’s data ecosystem is robust
- Collaborate with technical and non-technical counterparts to understand and prioritize projects
- Participate in design reviews and code reviews with other engineers
Requirements
- 7+ years of hands-on experience designing and managing a data warehouse, building ETL/ELT pipelines, and implementing DataOps processes
- Expertise in Snowflake; SnowPro certification preferred
- Experience using AWS infrastructure to process and move large volumes of data
- Strong SQL and Python skills
- Ability to write production-quality code and familiarity with Git-based development workflows
- Familiarity with platforms such as Matillion, Fivetran, or Airflow and associated design patterns for large-scale ETL/ELT systems
- Demonstrated ability to use APIs to extract data from vendor applications in a production setting
- Commitment to good engineering practices, including mentoring junior engineers