Build, enhance, and maintain data pipelines and dashboards that drive transparency and optimization within our infrastructure cost program (across AWS, Azure, and related platforms).
Support reporting and data development across AI enablement, engineering productivity, product usage, and other R&D-focused initiatives.
Partner with engineers and technical stakeholders to define, track, and optimize actionable metrics; participate in metric design, not just execution.
Apply strong SQL, dbt, and Python skills to automate measurement, ensure data quality, and maintain reliable operational metrics.
Deliver high-frequency reporting, provide ad hoc analyses, and enable rapid feedback loops for cost and operational effectiveness.
Work closely with engineers and finance leadership to define, track, and optimize the metrics that matter most for R&D efficiency and innovation.
Requirements
3+ years of experience working with large-scale data in analytical or product-oriented environments, with a strong focus on data exploration, interpretation, and communication.
Proficiency in writing complex SQL queries and building clean, reliable data models to support reporting and analysis.
Hands-on experience using dbt (data build tool) to transform and organize data for downstream analytics workflows.
Strong grasp of data analyst best practices, including analysis reproducibility, validating results through testing and sanity checks, and communicating insights clearly to both technical and non-technical audiences.
Experience with at least one programming language (e.g., Python or R) for data exploration, statistical analysis, and automating reporting workflows; familiarity with Databricks is a plus.
Bachelor's degree in a quantitative field such as Data Science, Mathematics, Statistics, Computer Science, Information Systems, or a related discipline.
Demonstrated ability to translate ambiguous business questions into structured analyses and data models optimized for insight generation.
Strong track record of collaboration with cross-functional partners to deliver high-impact data solutions.
Preferred but not required: MS degree in Data Science, Mathematics, Statistics, Computer Science, Information Systems, or a related quantitative field.
Preferred but not required: Experience integrating and analyzing cloud cost and operational datasets (AWS, Azure, Databricks).
Preferred but not required: Experience with predictive modeling or statistical analysis techniques to support deeper insights and forecasting.
Preferred but not required: Proficiency with business intelligence tools (e.g., Looker, Tableau, Power BI) to build intuitive dashboards and communicate insights effectively.