Salary
💰 $188,000 - $215,000 per year
Tech Stack
Airflow, AWS, ETL, Python, SQL, Tableau, Terraform, Vault
About the role
- Data Platform & Pipelines
- Own all ETL/ELT from external marketplaces and partners (listings and sold history data). Deliver coverage, freshness, and low latency across sources (APIs, feeds, and compliant web scraping).
- Design and operate pipelines; contribute to our AI-based asset detection, entity resolution, and robust outlier detection/cleansing.
- Set measurable SLOs for freshness/latency/accuracy; build monitoring, alerting, and runbooks for data incidents.
- Own Alt's internal asset catalog and help grow our proprietary datasets.
- Analytics Engineering
- Lead the analytics engineering function and partner closely with the Analytics team (analysts). Provide a clear, self-serve metrics layer and executive dashboards for GMV, liquidity, funnels, seller/buyer behavior, and ops SLAs.
- Own the event taxonomy and product analytics in Amplitude; design A/B experiments.
- Automation & AI Agents
- Map high‑leverage manual workflows (e.g., listing QA, price suggestions, dispute/CS routing, KYC review, payout reconciliation) and deliver AI agents to replace them.
- Partner with Product, Ops, Risk, and Finance to prioritize automations by ROI and shipped impact.
- Build cross-organizational workflows.
- Leadership
- Lead and grow a multidisciplinary team (data engineering, analytics engineering, and automation/ML engineers). Manage vendors and budgets.
- Translate business goals into a clear roadmap with quarterly outcomes.
- Tech You’ll Work With
- Warehouse/Lake: Snowflake; object storage (S3).
- Full-text search: Typesense.
- Orchestration: Airflow; data streaming.
- Transformations & Quality: dbt; Great Expectations/Soda; data catalog.
- Ingestion: Airbyte/Fivetran plus custom Python (Scrapy/Playwright) with proxy management.
- Analytics: Tableau, Metabase (core BI) and Amplitude (taxonomy, funnels, experiments).
- Infra/Dev: AWS; Terraform/IaC; GitHub Actions; Datadog.
- AI/Agents: OpenAI/Anthropic or Vertex AI; MCPs; agentic frameworks (LangChain, Pydantic AI); embedding models and RAG; evaluation/guardrails frameworks.
Requirements
- 8–12+ years in data engineering/analytics, including 4+ years leading managers and cross‑functional teams.
- Built and operated high‑scale data ingestion from heterogeneous sources (APIs, feeds, and web data) with strong SLAs on freshness and accuracy.
- Led an analytics engineering function and partnered with analysts on metric definitions, product analytics instrumentation, experimentation enablement, and exec reporting.
- Hands‑on with Python and SQL; comfortable reviewing PRs and diving into schema/model design.
- Experience scaling analytics across an organization, plus practical experience delivering LLM/agent systems in production with measurable business outcomes.
- Experience designing and improving operational workflows.
- Marketplace, fintech, or e‑commerce experience (pricing/valuation, risk, payments) preferred
- Prior success scaling teams in a high‑growth startup
- Bonus: You’re familiar with the collectibles space or alternative asset marketplaces