
Director, Data Engineering – Automation
Empower
Employment Type: Full-time
Location Type: Remote
Location: United States
Salary: $138,000 - $200,100 per year
About the role
- Lead a team of data engineers transforming data from disparate systems to enable insights and analytics for business stakeholders.
- Create technical roadmaps and recommend strategies for data pipelines and integration.
- Leverage cloud-based infrastructure to implement scalable, resilient, and efficient data engineering solutions.
- Collaborate with data analysts, data scientists, database administrators, cross-functional teams, and business stakeholders to solve problems.
- Influence architectural decisions and design patterns across the data platform.
- Provide technical leadership across the software development lifecycle, from design to deployment, including hands-on contribution.
- Develop project plans, facilitate prioritization and timelines, allocate resources, and take ownership of assigned technical projects in a fast-paced environment.
- Perform code reviews and ensure data engineers follow best-practice coding standards.
- Define and validate test cases to ensure data quality, reliability, and confidence in delivered data.
- Continuously improve quality, efficiency, and scalability of data pipelines, reducing gaps and inconsistencies.
Requirements
- Bachelor of Science in Computer Science or equivalent
- 7+ years of post-degree professional experience
- 4+ years building and maintaining ETL pipelines in a data warehouse environment
- 5+ years of Python development experience
- Experience hiring and leading a team of 3+ data engineers, including supervision, goal-setting, and supporting professional growth
- Strong communication and interpersonal skills to initiate and drive projects
- Experience with AWS integrations such as Kinesis, Firehose, Aurora Unload, Redshift, Spectrum, Elastic MapReduce (EMR), SageMaker, and Lambda
- Experience provisioning data sets for analytics tools such as Tableau, QuickSight, or similar, and knowledge of analytic tools such as R, Tableau, Plotly, and Python Pandas
- Expert SQL skills (including performance tuning, indexes, and materialized views) and proficiency designing and querying NoSQL databases to optimize big data storage and retrieval
- Experience with API integrations with external vendors to push/pull data between organizations, and familiarity with data orchestration pipelines using Argo or Airflow
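For context on the pipeline and orchestration experience listed above, here is a minimal sketch of an Airflow DAG that pulls records from an external vendor API and stages them for warehouse loading. It assumes Apache Airflow 2.4+; the DAG name, vendor URL, and schedule are illustrative placeholders, not Empower's actual stack.

```python
# Minimal ETL orchestration sketch (assumes Apache Airflow 2.4+).
# The vendor URL, DAG id, and schedule below are hypothetical examples.
from datetime import datetime, timedelta

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_vendor_records():
    """Pull records from a hypothetical external vendor API."""
    resp = requests.get("https://vendor.example.com/v1/records", timeout=30)
    resp.raise_for_status()
    return resp.json()  # return value is pushed to XCom for the next task


def load_to_warehouse(ti):
    """Placeholder load step; a real pipeline would COPY into Redshift or similar."""
    records = ti.xcom_pull(task_ids="extract_vendor_records")
    print(f"Would load {len(records)} records into the warehouse")


with DAG(
    dag_id="vendor_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(
        task_id="extract_vendor_records",
        python_callable=extract_vendor_records,
    )
    load = PythonOperator(
        task_id="load_to_warehouse",
        python_callable=load_to_warehouse,
    )
    extract >> load
```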
Benefits
- Medical, dental, vision and life insurance
- Retirement savings – 401(k) plan with generous company matching contributions (up to 6%)
- Tuition reimbursement up to $5,250/year
- Business-casual environment that includes the option to wear jeans
- Generous paid time off upon hire – including a paid time off program plus ten paid company holidays and three floating holidays each calendar year
- Paid volunteer time — 16 hours per calendar year
- Leave of absence programs – including paid parental leave, paid short- and long-term disability, and Family and Medical Leave (FMLA)
- Business Resource Groups (BRGs) – BRGs facilitate inclusion and collaboration across our business internally and throughout the communities where we live, work and play
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
ETL pipelines, Python, SQL, NoSQL databases, data quality, data engineering, data integration, data orchestration, performance tuning, test case validation
Soft skills
leadership, communication, interpersonal skills, problem-solving, project management, team supervision, goal-setting, resource allocation, collaboration, technical leadership
Certifications
Bachelor of Science in Computer Science