
Senior Engineer – Data DevOps
The Leaflet
Full-time
Location Type: Remote
Location: Florida • United States
About the role
- Architect and lead the design of complex, enterprise-scale data pipelines using Airflow, DBT, and Databricks
- Define and implement strategies for pipeline performance optimization to support real-time and batch processing at scale
- Lead the design and optimization of AWS-based data infrastructure, including S3, Lambda, and Snowflake architecture
- Establish and enforce best practices for cost-efficient, secure, and scalable data processing across the organization
- Design and optimize AWS SageMaker environments for ML teams, ensuring optimal performance and resource utilization
- Lead cross-functional collaboration with ML, Data Science, and Reporting teams to establish data strategy and ensure seamless data accessibility
- Design and implement comprehensive data pipeline monitoring, alerting, and logging frameworks to proactively detect failures and performance bottlenecks
- Architect automation solutions for data quality, lineage tracking, and schema evolution management
- Lead incident response efforts, performing complex troubleshooting and root cause analysis for critical data issues
- Champion and evolve Data DevOps best practices, driving automation, reproducibility, and scalability across the organization
- Mentor junior and mid-level engineers, conducting code reviews and providing technical guidance
- Establish technical standards, document complex infrastructure patterns, data workflows, and operational procedures
- Evaluate and recommend new technologies and tools to improve data infrastructure and workflows
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience
- Master's degree preferred
- 6+ years of experience in DevOps, DataOps, or similar roles, with at least 2 years in a senior or lead capacity
- Deep expertise in key technologies, including Airflow, Snowflake, SageMaker, DBT, and Databricks
- Strong architectural experience with AWS services and cloud-native design patterns
- Proven track record of leading technical projects and mentoring engineering teams
- Certifications in AWS, Snowflake, or related technologies strongly preferred
- Excellent communication, leadership, and interpersonal skills with demonstrated ability to influence technical direction
- Experience working in a fast-paced environment while managing multiple strategic priorities effectively
- Strong problem-solving skills with the ability to navigate ambiguity and make sound technical decisions
Benefits
- Competitive pay and benefits
- Flexible vacation allowance
- Flexible work from home or office hours
- Startup culture backed by a secure, global brand
- Opportunity to build products enjoyed by millions as part of a passionate team
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data pipelines, performance optimization, data infrastructure, data processing, data pipeline monitoring, automation solutions, troubleshooting, root cause analysis, Data DevOps, schema evolution management
Soft skills
leadership, communication, interpersonal skills, mentoring, problem-solving, influencing, collaboration, strategic prioritization, technical guidance, documentation
Certifications
AWS certification, Snowflake certification