
Data DevOps Engineer
The Leaflet
Full-time
Location Type: Remote
Location: Remote • 🇵🇱 Poland
Job Level
Mid-Level, Senior
Tech Stack
Airflow, AWS
About the role
- Design, build, and optimize data pipelines using Airflow, DBT, and Databricks.
- Monitor and improve pipeline performance to support real-time and batch processing.
- Manage and optimize AWS-based data infrastructure, including S3 and Lambda, as well as Snowflake.
- Implement best practices for cost-efficient, secure, and scalable data processing.
- Enable and optimize AWS SageMaker environments for ML teams.
- Collaborate with ML, Data Science, and Reporting teams to ensure seamless data accessibility.
- Implement data pipeline monitoring, alerting, and logging to detect failures and performance bottlenecks.
- Build automation to ensure data quality, lineage tracking, and schema evolution management.
- Participate in incident response, troubleshooting, and root cause analysis for data issues.
- Advocate for DataOps best practices, driving automation, reproducibility, and scalability.
- Document infrastructure, data workflows, and operational procedures.
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
- 3+ years of experience in DevOps, DataOps, or a similar role.
- Proficiency in key technologies, including Airflow, Snowflake, and SageMaker.
- Certifications in AWS, Snowflake, or other relevant technologies are a plus.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and manage multiple priorities effectively.
Benefits
- Professional development opportunities
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data pipelines, Airflow, DBT, Databricks, AWS, S3, Lambda, Snowflake, AWS SageMaker, DataOps
Soft skills
communication, interpersonal skills, ability to manage multiple priorities
Certifications
AWS certification, Snowflake certification