
Senior Data Engineer – Cost Platform
Spotify
Full-time
Location Type: Hybrid
Location: New York City • New York • United States
Salary
💰 $164,448 - $234,926 per year
About the role
- Be responsible for the design, implementation, and evolution of scalable, reliable data infrastructure that underpins Spotify’s cost and carbon intelligence.
- Own end-to-end data pipelines for cloud cost, usage, and emissions data, spanning ingestion, transformation, modeling, and serving layers.
- Partner closely with Data Scientists, Engineering, Finance, and Procurement to translate complex analytical and business needs into robust data architectures.
- Set technical direction and standards for data modeling, orchestration, testing, and observability within the Cost Engineering domain.
- Build and maintain curated, analytics-ready datasets that power executive reporting, forecasting, and optimization initiatives.
- Ensure data accuracy, consistency, and timeliness for high-stakes cost and emissions reporting used to guide strategic infrastructure investments.
- Proactively identify opportunities to improve the scalability, reliability, and cost efficiency of the data platform itself.
- Mentor other engineers and act as a technical sounding board, raising the overall bar for data engineering excellence on the team.
- Work across all Missions at Spotify to embed cost and climate awareness into decision-making, with a focus on accurate attribution of spend and carbon impact.
Requirements
- A senior data engineer with a strong track record of owning and operating production-critical data systems end to end.
- Hold a degree in computer science, engineering, or a related technical field, or equivalent proven experience.
- Experienced in designing data architectures that scale with both data volume and organizational complexity.
- Comfortable leading technical discussions, influencing build decisions, and aligning partners around long-term solutions.
- Thrive in environments with evolving requirements, balancing speed of delivery with adaptability and correctness.
- Strong communicator who can explain complex technical concepts clearly to both technical and non-technical audiences.
- Familiar with financial, billing, or usage data, and able to connect infrastructure metrics to real business and sustainability impact.
- Hands-on experience with cloud data platforms (GCP preferred).
- Highly proficient with Python, SQL, DBT, and modern orchestration frameworks, and experienced with data quality and observability tooling.
- Experienced with at least one data processing framework such as Spark, Flink, or Dataflow.
Benefits
- health insurance
- six-month paid parental leave
- 401(k) retirement plan
- monthly meal allowance
- 23 paid days off
- 13 paid flexible holidays
- paid sick leave
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data engineering, data architecture, data modeling, data pipelines, data transformation, data quality, Python, SQL, DBT, data processing frameworks
Soft Skills
communication, leadership, mentoring, collaboration, adaptability, problem-solving, influencing, technical discussion, clarity in explanation, balancing speed and correctness