Zeta Global

Staff Data Engineer

Employment Type: Full-time

Location Type: Remote

Location: United States

Salary

💰 $170,000 - $200,000 per year

About the role

  • Design and build a centralized semantic data layer using Cube Core (or equivalent technology such as Headless BI, dbt Metrics Layer, or Metriql) that provides a unified, governed abstraction over all company data sources.
  • Define semantic models, metrics, dimensions, and relationships that map to business domains across marketing, advertising, identity resolution, and customer analytics.
  • Expose the semantic layer via REST/GraphQL APIs and MCP-compatible tool interfaces purpose-built for consumption by AI agents and LLMs.
  • Integrate and unify data from heterogeneous systems including MySQL, DynamoDB, Aerospike, Snowflake, Amazon S3 (data lakes), Apache Kafka, Amazon SQS, and other internal data stores.
  • Build connectors, adapters, and federation layers to query across both operational (OLTP) and analytical (OLAP) data sources in a performant, cost-efficient manner.
  • Ensure seamless handling of both data at rest (warehouses, lakes, databases) and data in motion (streaming platforms, event buses, message queues).
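To give a concrete sense of the work described above: a semantic layer like Cube exposes a governed, model-driven query contract over REST. The sketch below builds such a query payload in Python; the cube and member names (`campaigns.impressions`, `campaigns.channel`, etc.) are hypothetical examples, not actual Zeta Global models, and the endpoint path reflects Cube's documented REST load route.

```python
import json

# Illustrative only: the shape of a Cube-style semantic-layer query.
# Measures, dimensions, and time dimensions are resolved by the semantic
# layer against governed model definitions, not raw tables.
query = {
    "measures": ["campaigns.impressions", "campaigns.spend"],
    "dimensions": ["campaigns.channel"],
    "timeDimensions": [
        {
            "dimension": "campaigns.created_at",
            "granularity": "day",
            "dateRange": "last 30 days",
        }
    ],
}

# A client would POST this JSON to the semantic layer's REST endpoint
# (e.g. /cubejs-api/v1/load in Cube); here we only serialize it to show
# the contract consumed by BI tools, services, or AI agents alike.
payload = json.dumps(query)
print(payload)
```

Because every consumer (dashboards, services, LLM-driven agents) issues queries against the same metric definitions, the semantic layer rather than each caller decides how a metric maps to the underlying OLTP, OLAP, or streaming sources.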

Requirements

  • 10+ years of experience in data engineering, data architecture, or platform engineering, with at least 3 years operating at a Staff/Principal level.
  • Deep hands-on expertise with multiple data stores: relational (MySQL/PostgreSQL), NoSQL (DynamoDB, Aerospike, MongoDB), cloud data warehouses (Snowflake, BigQuery, Redshift), and data lakes (S3, Delta Lake, Iceberg).
  • Strong experience with streaming and messaging systems: Apache Kafka, Amazon SQS/SNS, Kinesis, or equivalent.
  • Proven experience building or operating semantic/metrics layers using Cube.js/Cube Core, dbt Metrics, LookML, or similar technologies.
  • Expert-level SQL skills and experience with query optimization across distributed systems.
  • Production experience designing multi-tenant data platforms with strict security and isolation requirements.
  • Strong understanding of data governance, access control models (RBAC, ABAC), and compliance frameworks (SOC 2, GDPR, CCPA).
  • Experience designing and exposing APIs (REST, GraphQL) for data consumption at scale.
  • BS/MS in Computer Science, Data Engineering, or equivalent practical experience.

Benefits

  • Unlimited PTO
  • Excellent medical, dental, and vision coverage
  • Employee Equity
  • Employee Discounts, Virtual Wellness Classes, Pet Insurance, and more

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, data architecture, platform engineering, SQL, query optimization, semantic models, metrics layers, data governance, data integration, data modeling
Soft Skills
leadership, communication, problem-solving, collaboration, organizational skills
Certifications
BS in Computer Science, MS in Computer Science, Data Engineering certification