Salary
💰 $145,400 - $195,000 per year
Tech Stack
Airflow, Amazon Redshift, Distributed Systems, ETL, PySpark, Python, Spark, SQL
About the role
- Partner with technical and non-technical colleagues to understand data and reporting requirements
- Work with engineering teams to collect required data from internal and external systems
- Design table structures and define ETL pipelines to build performant, reliable, and scalable data solutions
- Develop data quality checks
- Develop and maintain ETL routines using orchestration tools such as Airflow (see the sketch after this list)
- Implement database deployments using tools like schemachange
- Perform ad hoc analysis as necessary
- Perform SQL and ETL tuning as necessary
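To give a concrete picture of the kind of work these bullets describe, below is a minimal sketch of an Airflow DAG that chains an ETL load with a downstream data quality check. The DAG id, table name, and task callables are hypothetical placeholders, and the sketch assumes Airflow 2.4+ (for the `schedule` parameter); it is illustrative only, not this team's actual pipeline.

```python
# Illustrative sketch: a daily ETL load followed by a data quality check.
# All names (dag_id, tasks, table) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_orders():
    """Placeholder ETL step that would land data into a warehouse table."""
    ...


def check_orders_not_empty():
    """Placeholder data quality check; raising an exception fails the run."""
    row_count = 1  # in practice, query the warehouse for the actual count
    if row_count == 0:
        raise ValueError("orders table is empty after load")


with DAG(
    dag_id="orders_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    quality = PythonOperator(
        task_id="check_orders", python_callable=check_orders_not_empty
    )
    load >> quality  # the quality check runs only after the load succeeds
```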
Requirements
- 5+ years of relevant data engineering experience
- Strong understanding of data modeling principles, including dimensional modeling and data normalization
- Good understanding of SQL engines, with the ability to conduct advanced performance tuning
- Ability to think strategically and analyze/interpret market and consumer information
- Strong written and verbal communication and presentation skills
- Excellent conceptual and analytical reasoning competencies
- Comfortable working in a fast-paced and highly collaborative environment
- Familiarity with Agile Scrum principles and ceremonies
- 2+ years implementing and reporting on business key performance indicators in data warehousing environments
- 2+ years using analytic SQL and working with traditional relational databases and/or distributed systems (Snowflake or Redshift)
- 1+ years of programming experience (e.g., Python, PySpark) preferred
- 1+ years with data orchestration/ETL tools (Airflow, NiFi) preferred
- Experience with Snowflake, Databricks/EMR/Spark, and/or Airflow
- Experience with ETL and orchestration tools such as Airflow, and database deployment tools like schemachange
- Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent work experience