Alpaca

Director of Data

Full-time

Location Type: Remote

Location: Anywhere in North America

About the role

  • Lead and develop three sub-teams: Platform Engineering & ETL, Analytics Engineering, and Data Science & Analytics. Manage leads, set priorities, and ensure delivery.
  • Own the Data Lakehouse architecture: Trino, Iceberg/GCS, Airflow, Airbyte, Redpanda CDC, dbt. Make build-vs-buy decisions on tooling.
  • Drive partner invoicing accuracy and evolution: ensure invoicing logic is versioned, reproducible, and scales with new pricing mechanisms and product launches.
  • Deliver embedded analytics: expose warehouse data to partners via BrokerDash, SSR pipelines, and API-based reporting. Own row-level security and entitlements.
  • Support product launches with data change management: coordinate data impact analysis for new products (fixed income, global stocks, perps, 24/5 trading) across downstream datasets, dashboards, and reverse ETL.
  • Accelerate self-service: move the organization toward self-serve analytics via semantic layers, data catalogues, and conversational BI so the data team can shift from ad-hoc queries to strategic projects.
  • Guide AI/ML enablement: oversee enterprise AI search, agent-based workflow automation, and LLM-powered analytics. Help balance vendor solutions with in-house development.
  • Collaborate with Finance, Sales, Product, Compliance, and Customer Success to translate business needs into data products.
  • Manage infrastructure costs: keep data + cloud cost ratio under target as AUC grows.
  • Operate production systems: own on-call processes, incident response, and SLOs for data freshness, accuracy, and availability.

Requirements

  • 8+ years in data engineering or analytics, including 3+ years managing data teams (leads + ICs).
  • Deep experience with modern data stack: dbt, Trino/Presto or equivalent query engines, Apache Iceberg or similar table formats, cloud object storage.
  • Hands-on experience with ETL/ELT patterns at scale: CDC (Debezium/Kafka), batch (Airflow/dbt), streaming, and reverse ETL.
  • Track record of building self-service analytics capabilities for non-technical stakeholders.
  • Experience with financial data: trading, invoicing, revenue attribution, or regulatory reporting in fintech or financial services.
  • Proficiency in Python and SQL. Comfortable reading code, reviewing PRs, and making architecture decisions.
  • Experience managing distributed/remote teams across multiple time zones.
  • Strong stakeholder management: you can translate between executive priorities and engineering execution.
  • Experience with GCP (GKE, GCS, BigQuery migration), Kubernetes, Helm, Terraform.

Benefits

  • Competitive Salary & Stock Options
  • Health Benefits
  • New Hire Home-Office Setup: One-time USD $500
  • Monthly Stipend: USD $150 per month via a Brex Card

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, analytics, ETL, ELT, CDC, batch processing, streaming, Python, SQL, self-service analytics
Soft Skills
stakeholder management, team management, prioritization, communication, collaboration, incident response, data impact analysis, strategic project management, leadership, translation of business needs