
Senior Data Engineer
Airspace
Full-time
Location Type: Hybrid
Location: San Diego • California • United States
Salary
💰 $130,000 - $180,000 per year
About the role
- Own and evolve our data infrastructure and pipelines
- Design, build, and maintain reliable ETL pipelines that ingest data from internal application Postgres databases and external SaaS platforms (e.g., Salesforce, Twilio, Zendesk)
- Manage and scale our orchestration layer built in Airflow
- Ensure reliability, consistency, and performance across our data systems through strong engineering practices and operational discipline
- Oversee and optimize how data flows into and through our warehouse layer (Snowflake), ensuring transformations (via dbt) integrate cleanly into upstream and downstream systems
- Be a technical leader and strategic architect
- Set a high bar for technical excellence, promoting clean architecture, code quality, and maintainability through thoughtful design, hands-on coding, and code reviews
- Drive architectural improvements and lead system-level refactors to improve performance, scalability, and efficiency
- Continuously evaluate and recommend tools, frameworks, and methodologies that enhance platform capabilities and developer velocity
- Level up our data function across the full lifecycle
- Act as a trusted partner to stakeholders in Product, Data Science, and Infrastructure, helping shape technical roadmaps that align with business priorities
- Identify and resolve systemic bottlenecks in our data workflows, and proactively implement process improvements
- Mentor teammates and foster a collaborative, inclusive, and high-performing engineering culture
Requirements
- 6+ years of experience in data engineering or backend infrastructure roles
- Expert-level Python engineering skills
- Deep knowledge of Airflow (familiarity with Astronomer or Google Cloud Composer is a plus)
- Experience managing and optimizing data infrastructure built on Snowflake (preferred) or BigQuery, including warehouse design, performance tuning, and cost efficiency
- Familiarity with dbt Core, including how it fits into modern data pipelines and transformation layers
- Solid understanding of SQL (especially in the context of large-scale warehouse environments)
- Proficiency in Git, version control, and collaborative development workflows
- Comfort working with CI/CD pipelines and deployment automation
- Experience ingesting data from both internal and third-party systems
Benefits
- High-quality health, dental, and vision plan options
- Unlimited PTO
- 401K with company match
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python • ETL • Airflow • Snowflake • dbt • SQL • Git • CI/CD • data engineering • backend infrastructure
Soft Skills
technical leadership • strategic architecture • code quality • mentoring • collaboration • problem-solving • process improvement • communication • operational discipline • stakeholder management