
Senior Data Engineer – Airflow, PySpark
Detroit Labs
Full-time
Location Type: Hybrid
Location: Auburn Hills • Michigan • United States
Salary
💰 $160,000 - $180,000 per year
About the role
- Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads
- Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability
- Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions (GCP preferred)
- Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines
- Implement secure coding best practices and design patterns throughout the development lifecycle
- Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions
- Create and maintain technical documentation, including process/data flow diagrams and system design artifacts
- Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices
- Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks
- Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage
Requirements
- 7+ years of Data Engineering experience building production-grade data pipelines using Python and PySpark
- Experience designing, deploying, and managing Airflow DAGs in enterprise environments
- Experience maintaining CI/CD pipelines for data engineering workflows, including automated testing and deployment
- Experience with cloud workflows and containerization, using Docker and cloud platforms (GCP preferred) for data engineering workloads
- Knowledge of and ability to follow twelve-factor app design principles
- Ability to write object-oriented Python code, manage dependencies, and follow industry best practices
- Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows)
- Experience working at the command line in Unix/Linux environments
- Solid understanding of SQL for data ingestion and analysis
- Engineering mindset: writes code with an eye for maintainability and testability
- Collaborative mindset: comfortable with code reviews, pair programming, and remote collaboration tools
- Detroit Labs is not currently able to hire candidates who will reside outside of the United States during their term of employment
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, PySpark, Airflow, CI/CD, Docker, GCP, TDD, SQL, object-oriented programming, Unix/Linux
Soft Skills
leadership, mentoring, collaboration, problem-solving, communication, documentation, analytical thinking, adaptability, teamwork, guidance