Arbor

Data Engineer

Employment Type: Full-time

Location Type: Remote

Location: District of Columbia, United States


About the role

  • Build and maintain our data pipeline infrastructure
  • Own our dbt transformation layer end-to-end
  • Partner closely with engineering, operations, and leadership to deliver analytics that inform business decisions
  • Work in Hex to build and maintain dashboards
  • Define data contracts and schema standards
  • Shape how we use AI in our data workflows

Requirements

  • 3–6+ years of experience in a data engineering or analytics engineering role
  • Strong dbt fundamentals
  • Solid SQL knowledge
  • Python for pipeline development or data quality tooling is a plus
  • Hands-on experience with Snowflake
  • Comfort with GCP data services (BigQuery, Cloud Storage, Pub/Sub) is a plus
  • Experience building dashboards for business stakeholders in tools like Hex, Looker, or similar
  • Genuine curiosity about electricity pricing, competitive markets, and how the grid works is preferred
  • An AI-native approach to productivity

Benefits

  • Competitive salary + meaningful equity + benefits
  • Flexible on location, but we value regular in-person collaboration with the team

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
dbt, SQL, Python, Snowflake, GCP, BigQuery, Cloud Storage, Pub/Sub, dashboard development, data quality tooling
Soft Skills
collaboration, communication, curiosity, problem-solving, leadership