Lead and grow a team of data engineers responsible for scalable, reliable, and secure ingestion, transformation, and pipeline management across batch and streaming systems.
Drive technical excellence in the design and operation of Airflow, Kafka, Spark, and Iceberg-based data workflows, ensuring data freshness, completeness, and quality SLAs are consistently met.
Partner with Product Management, Data Infrastructure, and Analytics Engineering to define the roadmap for ingestion, self-service pipeline automation, and data quality frameworks aligned to business goals.
Establish and track operational metrics to improve reliability and visibility of data systems.
Build a strong engineering culture focused on craftsmanship, ownership, and learning—mentoring engineers through design reviews, incident retrospectives, and technical deep dives.
Collaborate cross-functionally to develop declarative, self-service tools that reduce dependencies on central teams and improve “time to insight” for internal stakeholders.
Contribute to longer-term architectural strategy for the unified data lakehouse, data catalog, and real-time infrastructure that will power Webflow’s next generation of AI and ML use cases.
Requirements
Have 2+ years of experience leading high-performing engineering teams, driving both people development and technical execution.
Have a proven track record leading data engineering teams that design, build, and operate large-scale ingestion, transformation, and pipeline orchestration systems, with strong expertise in Airflow, Spark, Kafka, and modern lakehouse architectures like Iceberg on AWS.
Navigate complex distributed data systems with confidence, making thoughtful trade-offs between cost, latency, reliability, and velocity.
Balance operational excellence with pragmatic delivery, knowing when to invest in refactoring and when to ship.
Bring deep technical empathy for data engineers — you understand what it takes to design resilient pipelines, debug complex workflows, and maintain quality at scale.
Build high-trust, cross-functional relationships that drive alignment, accountability, and transparency across Data Infrastructure, Analytics, and partner teams.
Communicate with clarity and influence, connecting technical decisions to business outcomes and stakeholder needs.
Champion self-service, automation, and sustainable engineering practices that scale with Webflow’s growing data ecosystem.
Stay curious and growth-oriented — continuously learning new technologies like AI-driven data quality, real-time analytics, and lakehouse optimization to unlock creativity, accelerate progress, and amplify impact.
Benefits
Equity ownership (RSUs) in a growing, privately owned company
100% employer-paid healthcare, vision, and dental insurance coverage for full-time employees (working 30+ hours per week) and their dependents. Full-time employees may also be eligible for voluntary insurance options where applicable in the respective country of employment
12 weeks of paid parental leave for both birthing and non-birthing caregivers, as well as an additional 6-8 weeks of pregnancy disability leave for birthing parents to be used before child bonding leave (note: where local requirements are more generous, employees receive the greater benefit); full-time employees also have access to family planning care and reimbursement
Flexible PTO for all locations and sabbatical program
Access to mental wellness and professional coaching, therapy, and Employee Assistance Program
Monthly stipends to support work and wellness
401k plan or pension schemes (in countries where statutorily required), and other financial wellness benefits, like CPA and financial advisor coverage
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data engineering, pipeline orchestration, data ingestion, data transformation, Airflow, Kafka, Spark, Iceberg, AWS, lakehouse architecture