Salary
💰 $112,129 - $140,161 per year
Tech Stack
Azure, Cloud, ETL, Kafka, Python, Scala, SQL, Vault
About the role
- Architect, design, and implement advanced, enterprise-grade Snowflake data pipelines and integrations leveraging features such as tasks, streams, external tables, and secure data sharing (see the first sketch after this list).
- Lead Snowflake-centric data modeling (dimensional, data vault, and hybrid) to deliver high-performance, cost-optimized solutions.
- Design frameworks for real-time and batch data processing.
- Develop and implement a comprehensive analytics and reporting strategy aligned with the goals and initiatives of Partners and NC DHHS.
- Develop advanced transformation logic directly in Snowflake using SQL, Snowpark (Python or Scala), and stored procedures (second sketch below).
- Build and maintain reusable semantic models in Power BI for self-service analytics.
- Implement materialized views, clustering keys, and query optimization techniques for maximum performance (third sketch below).
- Deliver reusable Snowflake datasets to power enterprise analytics in Power BI and other BI tools.
- Develop automated monitoring and alerting for data quality and pipeline health (fourth sketch below).
- Establish and enforce standards for data governance, lineage, metadata, and privacy.
- Implement Snowflake role-based access control (RBAC), dynamic data masking, and row-level security (fifth sketch below).
- Apply advanced Snowflake governance capabilities, including object tagging, data classification, and audit logging, to meet HIPAA, GDPR, and other regulatory requirements.
- Lead CI/CD for Snowflake deployments using Git-based workflows with Azure DevOps or GitHub Actions.
- Implement automated testing and schema change management for Snowflake environments.
- Continuously monitor Snowflake performance and storage usage, and drive performance tuning, cost optimization, and scalability planning (sixth sketch below).
- Mentor junior data engineers and provide technical guidance.
- Collaborate with analysts, developers, and business leaders to translate business needs into scalable solutions.
- Communicate technical concepts to non-technical stakeholders effectively.
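
The sketches that follow are illustrative only: every object name they use (tables such as raw.claims, the etl_wh warehouse, schemas, and roles) is a hypothetical placeholder, not something named in this posting. First, a minimal incremental pipeline built from a stream and a scheduled task:

```sql
-- Capture row-level changes on a source table (all names hypothetical).
CREATE OR REPLACE STREAM raw_claims_stream ON TABLE raw.claims;

-- A task that wakes every 5 minutes but runs only when the stream has data.
CREATE OR REPLACE TASK load_curated_claims
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_CLAIMS_STREAM')
AS
  INSERT INTO curated.claims
    (claim_id, member_id, service_date, claim_status, paid_amount, loaded_at)
  SELECT claim_id, member_id, service_date, claim_status, paid_amount,
         CURRENT_TIMESTAMP()
  FROM raw_claims_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_curated_claims RESUME;
```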
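
Second, the transformation bullet names SQL, Snowpark, and stored procedures; the sketch below uses a Snowflake Scripting (SQL) procedure, and Snowpark Python or Scala would express the same logic through the DataFrame API. The schema is assumed:

```sql
CREATE OR REPLACE PROCEDURE curated.refresh_member_monthly_spend()
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
  -- Rebuild a curated aggregate from paid claims (hypothetical schema).
  CREATE OR REPLACE TABLE curated.member_monthly_spend AS
    SELECT member_id,
           DATE_TRUNC('month', service_date) AS service_month,
           SUM(paid_amount)                  AS total_paid
    FROM curated.claims
    WHERE claim_status = 'PAID'
    GROUP BY member_id, DATE_TRUNC('month', service_date);
  RETURN 'refreshed';
END;
$$;
```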
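
Third, clustering and materialized views (note that materialized views require Snowflake Enterprise Edition or higher):

```sql
-- Cluster a large fact table on its most common filter columns.
ALTER TABLE curated.claims CLUSTER BY (service_date, member_id);

-- Precompute a hot aggregate; Snowflake maintains it automatically.
CREATE OR REPLACE MATERIALIZED VIEW curated.mv_member_monthly_spend AS
  SELECT member_id,
         DATE_TRUNC('month', service_date) AS service_month,
         SUM(paid_amount)                  AS total_paid
  FROM curated.claims
  GROUP BY member_id, DATE_TRUNC('month', service_date);
```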
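
Fourth, a scheduled data-quality alert; the check itself (nulls in a key column) stands in for whatever quality rules actually apply:

```sql
-- Fire when the last hour of loads contains claims without a member ID.
CREATE OR REPLACE ALERT dq_null_member_ids
  WAREHOUSE = etl_wh
  SCHEDULE = '60 MINUTE'
  IF (EXISTS (
        SELECT 1
        FROM curated.claims
        WHERE member_id IS NULL
          AND loaded_at > DATEADD('hour', -1, CURRENT_TIMESTAMP())
      ))
  THEN
    INSERT INTO governance.dq_incidents (alert_name, detected_at)
    VALUES ('dq_null_member_ids', CURRENT_TIMESTAMP());

-- Alerts, like tasks, start suspended.
ALTER ALERT dq_null_member_ids RESUME;
```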
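
Fifth, dynamic data masking plus row-level security; the role names and the role-to-county mapping table are assumptions:

```sql
-- Mask SSNs for every role outside an approved list.
CREATE OR REPLACE MASKING POLICY governance.phi_ssn_mask AS (val STRING)
  RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PHI_ANALYST') THEN val ELSE 'XXX-XX-XXXX' END;

ALTER TABLE curated.members MODIFY COLUMN ssn
  SET MASKING POLICY governance.phi_ssn_mask;

-- Show only the rows for counties mapped to the caller's role.
CREATE OR REPLACE ROW ACCESS POLICY governance.county_filter AS (county_code STRING)
  RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'SYSADMIN'
  OR EXISTS (SELECT 1
             FROM governance.role_county_map m
             WHERE m.role_name = CURRENT_ROLE()
               AND m.county_code = county_code);

ALTER TABLE curated.members
  ADD ROW ACCESS POLICY governance.county_filter ON (county_code);
```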
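
Finally, a resource monitor as one concrete cost-control lever (the quota and thresholds are illustrative, and creating monitors requires ACCOUNTADMIN):

```sql
-- Cap monthly credit spend on the ETL warehouse.
CREATE OR REPLACE RESOURCE MONITOR etl_monthly_quota
  WITH CREDIT_QUOTA = 500
       FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 80 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monthly_quota;
```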
Requirements
- Bachelor’s degree in Computer Science, Data Science, Engineering, or related field; or equivalent experience
- 5+ years of experience in data engineering, with at least 3 years of direct Snowflake architecture and development experience
- Proven track record of delivering enterprise-grade Snowflake solutions integrated with Azure
- Experience with Snowpark, Snowflake security administration, and advanced cost optimization techniques
- SnowPro Advanced Architect or SnowPro Core certification (preferred)
- Healthcare or Medicaid data experience