Salary
💰 $148,845 - $188,089 per year
Tech Stack
Airflow, AWS, Python, SQL
About the role
- Work across teams to build pipelines, services, and tools that equip UA teammates and integrated systems with the data, information, and knowledge needed to fulfill Under Armour’s mission
- Design and build data products and data flows for the continued expansion of the data platform mission, ensuring deployed code is performant, well-styled, validated, and documented
- Implement data platform architectural designs/approaches
- Instrument and monitor the systems and products of the data platform
- Design, build, integrate, and maintain data ingestion pipelines from third-party source systems and/or data providers
- Design, build, integrate, and maintain data ingestion pipelines from offline sources, including various third-party services
- Design and integrate data replication solutions from various UA enterprise or other back-end services into the Data Platform
- Support data catalog initiatives and adoption across the Data Platform user community
- Support data cleansing and data quality initiatives across the Data Platform for its foundational data
- Support identity resolution initiatives across the Data Platform
- Lead one or two medium or large projects at a time
- Other data engineering duties as assigned
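To give a flavor of the ingestion, cleansing, and loading work described above, here is a minimal extract-transform-load sketch in Python and SQL (two of the stack's listed tools). This is purely illustrative and not part of the posting; all table names, field names, and the in-memory SQLite target are hypothetical stand-ins for real third-party sources and the Data Platform:

```python
import json
import sqlite3


def extract(raw: str) -> list[dict]:
    # Parse a hypothetical third-party payload delivered as JSON lines.
    return [json.loads(line) for line in raw.splitlines() if line.strip()]


def transform(records: list[dict]) -> list[tuple]:
    # Basic cleansing/validation: drop records missing required fields.
    cleaned = []
    for r in records:
        if "id" in r and "event" in r:
            cleaned.append((int(r["id"]), str(r["event"])))
    return cleaned


def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    # Load validated rows into a (hypothetical) warehouse table and
    # return the resulting row count.
    conn.execute("CREATE TABLE IF NOT EXISTS events (id INTEGER, event TEXT)")
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]


raw = '{"id": 1, "event": "run"}\n{"event": "missing-id"}\n{"id": 2, "event": "lift"}'
conn = sqlite3.connect(":memory:")
count = load(transform(extract(raw)), conn)
```

In production, each stage would typically be a task in an orchestrator such as Airflow, with the load targeting a warehouse like Snowflake rather than SQLite.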
Requirements
- Bachelor's degree in Computer Science, Information Systems, or a closely related technical field plus 5 years of progressively responsible data and/or software engineering experience, OR a master's degree in Computer Science, Information Systems, or a closely related technical field plus 3 years of progressively responsible data and/or software engineering experience
- 2 years of experience with data engineering fundamentals (design patterns, common practices)
- 2 years of experience building high volume data products and services
- 2 years of experience with security- and privacy-by-design frameworks
- 2 years of experience with the data science lifecycle
- 2 years of programming in SQL
- 2 years of programming in Python
- 2 years of experience with Snowflake
- Demonstrated knowledge of Job Orchestration Tools such as Airflow or similar tools
- Demonstrated knowledge of AWS data-related products such as EMR, Glue, S3, and/or Lambda
- Progressively responsible experience in data and/or software engineering