
Senior Data Engineer, Product Analytics
DataRobot
Full-time
Location Type: Remote
Location: California • Massachusetts • United States
About the role
- Architect and deliver scalable, reliable data warehouses, analytics platforms, and integration solutions.
- Play a critical role in supporting our internal AI strategy.
- Partner with Product Manager, Analytics to shape our project roadmap and lead its implementation.
- Collaborate with and mentor cross-functional teams to design and execute sophisticated data software solutions that elevate business performance and align with coding standards and architecture.
- Develop, deploy, and support analytic data products such as data marts, ETL (extract/transform/load) jobs, and functions (in Python/SQL/dbt) in a cloud data warehouse environment using Snowflake, Stitch/Fivetran/Airflow, and AWS services (e.g., EC2, Lambda, Kinesis).
- Navigate various data sources and efficiently locate data in a complex data ecosystem.
- Work closely with data analysts and data scientists to build models and metrics that support their analytics needs.
- Implement data modeling enhancements driven by upstream data changes.
- Instrument telemetry capture and data pipelines for various environments to provide product usage visibility.
- Maintain and support deployed ETL pipelines and ensure data quality.
- Develop monitoring and alerting systems to provide visibility into the health of data infrastructure, cloud applications, and data pipelines.
- Partner with the IT enterprise applications and engineering teams on integration efforts between systems that impact Data & Analytics.
- Work with R&D to answer complex technical questions about product analytics and corresponding data structure.
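As a purely illustrative sketch (not part of the posting), the transform step inside the kind of Python ETL job described above might look like the following; all record fields and function names here are hypothetical assumptions, not DataRobot's actual schema.

```python
from datetime import datetime, timezone

def normalize_events(raw_events):
    """Flatten raw product-telemetry events into rows ready for a
    warehouse load (e.g., a staging table in Snowflake).

    `raw_events` is a list of dicts like:
        {"user_id": "u1", "event": "Click", "ts": "2024-01-01T00:00:00Z"}
    Events missing a user_id are dropped; event names are lower-cased
    and timestamps parsed to timezone-aware UTC datetimes.
    """
    rows = []
    for ev in raw_events:
        if not ev.get("user_id"):
            continue  # skip unattributable events
        rows.append({
            "user_id": ev["user_id"],
            "event_name": ev.get("event", "unknown").lower(),
            "event_ts": datetime.fromisoformat(
                ev["ts"].replace("Z", "+00:00")
            ).astimezone(timezone.utc),
        })
    return rows
```

In practice a step like this would sit behind an orchestrator (e.g., an Airflow task) and write its output to a Snowflake stage rather than return it in memory.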
Requirements
- 5-7 years of experience in a data engineering or data analyst role.
- Experience building and maintaining product analytics pipelines, including the implementation of event tracking (e.g., Snowplow) and the integration of behavioral data into Snowflake from platforms like Amplitude.
- Strong understanding of data warehousing concepts, working experience with relational databases (Snowflake, Redshift, Postgres, etc.), and SQL.
- Experience working with cloud providers like AWS, Azure, GCP, etc.
- Solid programming foundations and proficiency in data-related languages like Python, Scala, and R.
- Experience with DevOps workflows and tools like dbt, GitHub, Airflow, etc.
- Experience with an infrastructure-as-code tool such as Terraform or CloudFormation.
- Excellent communication skills, with the ability to engage both technical and non-technical audiences.
- Knowledge of real-time stream technologies like AWS Firehose, Spark, etc.
- Highly collaborative in working with teammates and stakeholders.
- AWS cloud certification is a plus.
- BA/BS in a technical or engineering field preferred.
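To give a concrete (again, purely illustrative) flavor of the monitoring and data-quality work this role involves, a minimal table-freshness check might be sketched in Python like this; the lag threshold is an assumption, and a real deployment would pull `latest_ts` from the warehouse and route failures to an alerting system.

```python
from datetime import datetime, timedelta, timezone

def check_freshness(latest_ts, max_lag_hours=6):
    """Return (ok, lag_hours) for a table whose newest row has
    timestamp `latest_ts`.

    `ok` is False when the data is staler than `max_lag_hours`,
    i.e., when a pipeline-health alert should fire.
    """
    lag = datetime.now(timezone.utc) - latest_ts
    lag_hours = lag.total_seconds() / 3600
    return lag_hours <= max_lag_hours, lag_hours
```

Checks like this are typically scheduled alongside the pipelines they watch, so stale data is caught before downstream analysts see it.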
Benefits
- Medical, Dental & Vision Insurance
- Flexible Time Off Program
- Paid Holidays
- Paid Parental Leave
- Global Employee Assistance Program (EAP) and more!
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data engineering, data analysis, ETL, data modeling, SQL, Python, Scala, R, DevOps, data warehousing
Soft skills
communication, collaboration, mentoring, leadership, problem-solving, project management, interpersonal skills, technical communication, organizational skills, stakeholder engagement
Certifications
AWS cloud certification