NBCUniversal

Data Modeler, Engineer 3

Full-time

Location Type: Hybrid

Location: West Chester, Pennsylvania, United States


About the role

  • Design and maintain conceptual, logical, and physical data models for domain-owned data products
  • Translate product requirements into consumer-friendly analytical models, entities, and metrics
  • Build domain-aligned, analytics-ready data structures optimized for modern access patterns
  • Define and maintain semantic layers and shared business definitions across data products
  • Align data product models with modern data architecture patterns (lakehouse, multi-layered data platforms)
  • Partner with data platform and engineering teams to ensure data products are efficiently implemented in Snowflake and Apache Spark
  • Contribute to standards for schema evolution, data contracts, and backward compatibility
  • Work closely with data product managers, analytics engineers, and business stakeholders to refine product requirements
  • Act as a modeling subject-matter expert within data product teams
  • Support onboarding and adoption of data products by downstream consumers
  • Embed data quality, consistency, and usability into data product designs
  • Maintain clear documentation for data products, models, and metrics
  • Support metadata, lineage, and discoverability initiatives aligned to data product governance

Requirements

  • 5+ years of experience in data modeling, analytics engineering, or data architecture roles.
  • Strong proficiency in conceptual, logical, and physical data modeling techniques.
  • Hands-on experience with Snowflake, including schema design, performance optimization, and cost governance.
  • Experience developing data models and transformations in Apache Spark (Spark SQL, PySpark preferred).
  • Familiarity with modern data architectures (e.g., lakehouse, domain-oriented data products, multi‑layered data platforms).
  • Experience working with modeling tools such as Erwin or SAP PowerDesigner.
  • Knowledge of schema evolution practices, data contracts, and metadata/lineage standards.
  • Proven ability to collaborate with cross-functional data and business teams.
  • Strong understanding of semantic modeling and shared business metric design.
  • Experience working in cloud‑native environments (AWS, Azure, or GCP).
  • Ability to balance scalability, performance, quality, and usability in data product design.
  • Excellent communication skills with the ability to explain complex data concepts to non‑technical audiences.

Benefits

  • Array of options
  • Expert guidance
  • Always-on tools

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data modeling, analytics engineering, data architecture, schema design, performance optimization, data transformations, semantic modeling, data quality, data usability, metadata standards
Soft Skills
collaboration, communication, subject-matter expertise, problem-solving, cross-functional teamwork, documentation, stakeholder engagement, analytical thinking, adaptability, user-centric design