Design, build, and maintain data pipelines that integrate multiple systems (application, marketing, CRM, finance) into a central data warehouse.
Develop and manage data models and transformations using dbt, ensuring clear lineage, testing, and documentation (a minimal example appears after this list).
Implement and monitor data orchestration workflows (dbt, Airflow, or AWS tools like Kinesis) for reliability and freshness.
Partner with application engineers (e.g., in a Ruby on Rails environment) to define and capture analytics events and improve data collection.
Maintain data quality, consistency, and validation, identifying anomalies and ensuring trustworthy outputs (see the sample check after this list).
Use SQL (Postgres, Snowflake) and Python for analysis, debugging, and automation.
Collaborate with third-party partners such as Sundial and data vendors to integrate external data sources effectively.
Drive adoption of self-service analytics, enabling product, marketing, and finance teams to independently explore and use data.
Champion data best practices — from documentation and transparency to governance and reproducibility.
Leverage AI-assisted tools for data discovery, documentation, and code generation to accelerate development and improve data quality.
Experiment with AI-powered analytics and automation to surface insights faster and enhance self-service data experiences across teams.
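To give a concrete flavor of the dbt work described above, here is a minimal sketch of a staging model. The source, model, and column names are hypothetical assumptions for illustration, not taken from this team's actual project.

```sql
-- models/staging/stg_orders.sql (hypothetical model and source names)
-- Standardizes raw application orders into a clean, analytics-ready staging layer.
with source as (

    select * from {{ source('app_db', 'orders') }}

),

renamed as (

    select
        id                   as order_id,
        user_id,
        total_cents / 100.0  as total_amount,
        created_at
    from source

)

select * from renamed
```

In practice a model like this would be paired with schema tests (for example, uniqueness and not-null checks on order_id) and column-level documentation so lineage stays visible in the dbt docs site.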
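For the data-quality responsibility, a simple anomaly check might look like the sketch below, which flags days whose order volume falls well below a trailing seven-day baseline. Table and column names are illustrative assumptions.

```sql
-- Hypothetical data-quality check: flag days where order volume drops sharply
-- versus the trailing 7-day average, which may indicate a broken pipeline or
-- a tracking regression.
with daily_orders as (

    select
        created_at::date as order_date,
        count(*)         as order_count
    from orders
    group by 1

),

with_baseline as (

    select
        order_date,
        order_count,
        avg(order_count) over (
            order by order_date
            rows between 7 preceding and 1 preceding
        ) as trailing_7d_avg
    from daily_orders

)

select *
from with_baseline
where trailing_7d_avg is not null
  and order_count < 0.5 * trailing_7d_avg   -- flag days at less than half the recent baseline
order by order_date desc;
```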
Requirements
5+ years of experience as a Data Engineer or similar role in analytics or data infrastructure.
Proven expertise with SQL (especially Postgres and Snowflake) for data modeling, validation, and analysis.
Experience with dbt and modern data orchestration tools (Airflow, AWS Kinesis, etc.).
Strong understanding of data modeling principles, including fact/dimension design, normalization vs. denormalization, and schema optimization (a small schema sketch follows this list).
Proficiency in Python for data manipulation, testing, and pipeline automation.
Hands-on experience with AWS data tools and cloud-native data ecosystems.
Demonstrated ability to partner cross-functionally with engineering, analytics, finance, and marketing teams.
Excellent communication and documentation skills — able to make complex data systems understandable and accessible.
A rigorous, skeptical approach to data analysis: you validate, cross-check, and verify before drawing conclusions.
Curiosity about emerging AI technologies and how they can optimize data engineering workflows, analytics, and business intelligence.
Comfort using AI coding assistants or LLM-powered tools (e.g., for SQL generation, dbt documentation, or debugging) as part of your development process.
Passion for building self-service data solutions that empower others to be data-informed.
Bonus: Experience integrating systems like HubSpot, Stripe, or marketing automation platforms.
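To ground the fact/dimension requirement, here is a minimal star-schema sketch in SQL. The tables, keys, and columns are hypothetical and meant only to show the shape of the design, not a prescribed warehouse layout.

```sql
-- Hypothetical star schema: one fact table keyed to two dimensions.
create table dim_customer (
    customer_key   integer primary key,
    customer_id    text not null,        -- natural key from the source CRM
    segment        text,
    signup_date    date
);

create table dim_date (
    date_key       integer primary key,  -- e.g. 20240131
    calendar_date  date not null,
    fiscal_quarter text
);

create table fct_orders (
    order_key      integer primary key,
    customer_key   integer references dim_customer (customer_key),
    date_key       integer references dim_date (date_key),
    order_total    numeric(12, 2),
    item_count     integer
);
```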
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
communication skills, documentation skills, cross-functional collaboration, data analysis, curiosity, problem-solving, attention to detail, data governance, self-service data solutions, critical thinking