
Data Product Architect
qode.world
Employment Type: Full-time
Location Type: Hybrid
Location: Pennsylvania • Ohio • United States
About the role
- Define end-to-end architecture for data products from source systems through analytics and downstream consumption.
- Design and govern logical, physical, and semantic data models (facts, dimensions, metrics, hierarchies).
- Apply domain-driven and data-product design principles to ensure consistency and reusability.
- Establish and govern data contracts and domain interfaces.
- Define architectural patterns across Hadoop, lakehouse, and streaming platforms.
- Guide batch, near-real-time, and event-driven designs using Spark and Kafka.
- Ensure alignment across on-prem and cloud-based platforms in a hybrid enterprise environment.
- Review and guide ingestion and data service designs built on Java/Spring Boot and Python.
- Architect Kafka-based pipelines for decoupled, event-driven data products.
- Apply graph modeling patterns where relationship-centric use cases require it.
- Define enterprise semantic models supporting BI and analytics tools (Power BI, Fabric, Tableau).
- Ensure consistent business definitions and metrics across reporting and analytics.
- Enable one-to-many consumption where a single data product supports multiple use cases.
- Embed data quality, lineage, metadata, and observability into architectural designs.
- Partner with centralized governance, security, and risk teams to meet regulatory requirements.
- Define data product ownership, stewardship, and lifecycle standards.
- Act as the architectural authority for data products within the organization.
- Review and approve solution designs and reference implementations.
- Bridge enterprise architecture standards with delivery execution across teams.
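The responsibilities above include establishing and governing data contracts at domain interfaces. As a purely illustrative sketch (the product name, field names, and types below are hypothetical, not from this posting), one lightweight way to express and enforce such a contract in Python is a frozen dataclass plus a validation gate at the interface:

```python
from dataclasses import dataclass, fields

# Hypothetical data contract for an "orders" data product.
# Field names and types are illustrative assumptions only.
@dataclass(frozen=True)
class OrderEvent:
    order_id: str
    customer_id: str
    amount_cents: int
    currency: str

def validate(record: dict) -> OrderEvent:
    """Enforce the contract at the domain interface: reject records
    with missing fields or wrong types before they enter the pipeline."""
    for f in fields(OrderEvent):
        if f.name not in record:
            raise ValueError(f"missing field: {f.name}")
        if not isinstance(record[f.name], f.type):
            raise TypeError(f"{f.name} must be {f.type.__name__}")
    return OrderEvent(**record)

good = validate({"order_id": "o1", "customer_id": "c9",
                 "amount_cents": 1250, "currency": "USD"})
print(good.amount_cents)  # 1250
```

In practice the same idea is usually carried by a schema registry (e.g. Avro or JSON Schema alongside Kafka topics) rather than hand-rolled checks; the sketch only shows the shape of the guarantee a data contract provides.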
Requirements
- 12+ years of experience in data architecture, data engineering, or analytics architecture.
- Proven experience designing enterprise-scale data products and platforms.
- Strong expertise in data modeling, lakehouse architectures, and streaming systems.
- Excellent communication skills with technical and business stakeholders.
- Bachelor’s degree in Computer Science, Engineering, or related field (Master’s preferred).
- Proficiency in Data Platforms: Hadoop, modern lakehouse architectures.
- Experience with Streaming & Processing: Spark, Spark Streaming, Kafka.
- Programming skills: Java, Spring Boot (design/review), Python.
- Knowledge of Modeling & Analytics: Dimensional, canonical, domain-driven modeling; semantic layers.
- Familiarity with Observability: Data observability and operational monitoring (ELK preferred).
- Understanding of Governance & Security: Data governance, lineage, quality, and compliance.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data architecture, data engineering, analytics architecture, data modeling, lakehouse architecture, streaming systems, Java, Spring Boot, Python, data observability
Soft Skills
communication skills