Transform business requirements into conceptual, logical and physical data models for AI initiatives, including the Abacus data model and its metadata.
Use Databricks Unity Catalog to build a semantic layer and to manage AI assets and registered metric definitions.
Leverage AI Assistant and SQL Agents to build and test the semantic layer.
Build conceptual, logical and physical data models aligned with enterprise architecture, business requirements and industry standards.
Follow data modeling best practices, including schema evolution, lossless data modeling, metadata management, and data versioning.
Use enterprise data modeling tools (Erwin, ER/Studio, DBT, or similar).
Ensure models support both analytical (data science, actuarial, reporting) and operational (Claims, Care Management) use cases.
Maintain metadata, business glossary, entity relationship diagrams and data dictionaries to support data lineage and governance.
Partner with Client Management, Product, and Business Analyst teams to understand client needs, assess data model impact, and support client implementation.
Perform data profiling analysis (see the illustrative SQL sketch after this list).
Use SQL, data modeling and data integrity principles.
Collaborate with the Director of Data Modeling and Senior Data Modelers to adhere to data modeling standards, metadata governance, and data architecture strategies.
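For illustration only, a minimal sketch of the kind of SQL-based profiling this role performs against a Unity Catalog table. The three-part table name and column names are hypothetical placeholders, not references to an actual Abacus schema.

    -- Basic profile of a claims table: volume, uniqueness, nulls, and date range.
    -- Table and column names are assumed for illustration; substitute real Unity Catalog objects.
    SELECT
      COUNT(*)                                            AS row_count,
      COUNT(DISTINCT claim_id)                            AS distinct_claim_ids,
      SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END)  AS null_member_ids,
      MIN(service_date)                                   AS earliest_service_date,
      MAX(service_date)                                   AS latest_service_date
    FROM payer.claims.claim_line;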
Requirements
Strong hands-on experience with Databricks Unity Catalog, cloud data platforms (Databricks, Snowflake), and data lakehouse architectures.
Healthcare reporting or data engineering experience.
Experience with data governance, security, and regulatory compliance in healthcare data management.
2+ years of experience in data modeling and cloud-based data engineering or reporting.
Experience with enterprise data modeling tools (Erwin, ER/Studio, DBSchema, or similar).
Understanding of healthcare data across Core Payer (Claims, Membership, Enrollment, Provider, Billing, Premium) and Clinical Interoperability (FHIR, HL7, ADT, CCDA, etc.) domains.
Experience with schema evolution, lossless data modeling, and data versioning strategies.
Strong background in SQL, data profiling, and metadata creation and maintenance.
Ability to learn from senior modelers, enforce modeling best practices, and drive adoption of enterprise data standards.
Excellent collaboration skills to work with data architects, engineers, and client implementation teams.
NoSQL data model experience (bonus).
Python coding skills (bonus).
Knowledge of APIs, SDKs, and distributed data frameworks for enabling real-time and micro-batch processing of structured, semi-structured and unstructured data (bonus).