
Senior Data Architect
Irth Solutions
Full-time
Location Type: Remote
Location: United States
About the role
- Design the high‑level and detailed architecture blueprint for Irth’s new Databricks‑based data estate, including lakehouse patterns, medallion architecture (Bronze/Silver/Gold), and semantic layers.
- Develop a multi‑cloud ingestion and consolidation strategy for data sourced from AWS (S3, RDS, DynamoDB), Azure (SQL MI, Storage Accounts), GCP, and on‑prem systems.
- Define patterns for data lineage, metadata management, data quality, CDC/SCD, streaming and batch ingestion, and federated query capabilities.
- Establish an enterprise‑level data governance model, including classification schemas, residency rules, cataloging (Unity Catalog), retention policies, and security standards.
- Partner with security and compliance teams to enforce GDPR, PIPEDA, Australian Privacy Act, SOC 2, and other regulatory requirements.
- Architect for scalability, reliability, cost‑efficiency, and support of downstream analytics, reporting, GIS workflows, and AI/ML workloads.
- Produce architectural documentation, diagrams, runbooks, and design standards and best practices.
- Collaborate closely with a distributed team of data engineers to implement ingestion pipelines, Delta Lake storage patterns, orchestration workflows, and Databricks Lakeflow components.
- Provide technical leadership, code reviews, architectural guidance, and hands‑on support when needed.
- Ensure architecture is implemented consistently across tenants, clouds, products, and engineering teams.
- Work with DevOps/Platform teams on infrastructure, identity, networking, VPC/VNet, and CI/CD integration.
- Establish monitoring, observability, and operational governance for the data estate.
- Define and enforce RBAC/ABAC policies, encryption standards, audit logging, and tenant‑level isolation.
- Partner with governance teams to unify metadata and lineage across all sources via Unity Catalog and Purview integration.
- Ensure ongoing compliance with data residency and cloud‑specific regulatory constraints.
- Develop standards for retention, archival, disaster recovery, and cross‑region protections.
- Architect support for operational reporting, BI (Power BI, Databricks SQL), semantic modeling, and consumption patterns.
- Design foundations enabling AI/ML use cases such as predictive risk modeling, summarization, geospatial analytics, and LLM‑driven insights.
- Advance a BI‑tool‑agnostic semantic model strategy for consistent business definitions across products.
- Serve as Irth’s senior expert on Databricks Lakehouse architecture, modern data engineering, and multi‑cloud data strategy.
- Mentor engineers and analysts in best practices for data modeling, governance, pipeline design, and scalable architectures.
- Partner with product teams, executives, and stakeholders to translate business requirements into technical data solutions.
- Participate in vendor evaluations, roadmap planning, and architectural governance boards.
Requirements
- 8+ years of experience in data architecture, data engineering, or platform architecture, with at least 3 years in a senior or lead role.
- Deep expertise with Databricks (Azure preferred), including Delta Lake, Unity Catalog, SQL Warehouses, Lakehouse design, and pipeline orchestration.
- Strong experience building multi‑cloud or hybrid data architectures (AWS, Azure, GCP).
- Expert in medallion architecture, lakehouse design patterns, metadata management, lineage, and enterprise governance.
- Advanced SQL and experience with one or more: Python, Scala, PySpark, Spark SQL.
- Strong understanding of security, compliance, RBAC/ABAC, encryption, and data residency considerations.
- Experience with CDC, SCD Type 1/2, streaming ingestion, batch pipelines, and workload performance optimization.
- Demonstrated ability to translate highly complex technical concepts to non‑technical audiences.
- Strong documentation, diagramming, architectural review, and communication skills.
Preferred
- Experience with Unity Catalog–Purview integration and enterprise metadata strategies.
- Background in geospatial data, operational risk modeling, or AI/ML enablement.
- Knowledge of Power BI semantic modeling or BI‑platform‑agnostic semantic layer design.
- Experience with highly regulated or multi‑tenant data environments.
- Certifications: Databricks Data Engineer Professional, Databricks Architect, Azure Data Architect, AWS/GCP data certifications.
Benefits
- Join a dynamic, growing company that is well respected in its industry.
- Competitive salary
- Health plan options including medical, dental, and vision
- 401k (US), RSP (Canada) + company match
- Flexible PTO policy plus company-paid holidays
- Benefit options such as health insurance, life insurance, and discount and perks programs
- Generous “work from home” stipend to get you started
- Team events including monthly lunches for everyone, volunteer outings, and quarterly gatherings
- Hybrid employees have access to snacks, beverages and coffee at our Columbus office
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data architecture, data engineering, platform architecture, Databricks, Delta Lake, Unity Catalog, SQL, Python, Scala, PySpark
Soft Skills
technical leadership, communication skills, documentation, diagramming, architectural review, mentoring, collaboration, translating technical concepts, problem-solving, organizational skills
Certifications
Databricks Data Engineer Professional, Databricks Architect, Azure Data Architect, AWS data certification, GCP data certification