Lead the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
Architect and implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
Optimize Snowflake database performance, storage, and security.
Provide guidance on Snowflake best practices.
Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
Ensure data quality, integrity, and governance across the organization.
Provide technical leadership and mentorship to junior and mid-level data engineers.
Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.
Perform independent, in-depth data analysis and discovery to understand existing source systems and fact/dimension data models, and implement an enterprise data warehouse solution in Snowflake.
Focus on performance optimization, security, scalability, and Snowflake credit control and management.
Requirements
Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
7+ years of experience in data engineering, with at least 3 years of dedicated experience engineering solutions in a Snowflake environment.
Deep expertise in ANSI SQL, performance tuning, and data modeling techniques.
Strong experience with cloud platforms (Azure preferred) and their data services.
Proficiency in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
Hands-on experience with scripting languages like Python for data processing.
Strong understanding of data governance, security, and compliance best practices.
Snowflake SnowPro certification, preferably on the data engineering path.
Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
Familiarity with BI and visualization tools such as PowerBI.
Familiarity working in an agile Scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
Ability to self-manage large, complex deliverables and document user stories and tasks through Azure DevOps.
Personal accountability for committed sprint user stories and tasks.
Strong analytical and problem-solving skills with the ability to handle complex data challenges.
Ability to read, understand, and apply state/federal laws, regulations, and policies.
Benefits
As part of our robust Rewards & Recognition program, this role is eligible for our Ventra performance-based incentive plan.
Help Us Grow Our Dream Team — Join Us, Refer a Friend, and Earn a Referral Bonus!
This position is also eligible for a discretionary incentive bonus in accordance with company policies.
Ventra Health is committed to providing reasonable accommodations to qualified individuals with disabilities.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Snowflake, ETL, ELT, ANSI SQL, data modeling, Azure Data Factory, Python, CI/CD, Infrastructure as Code, Apache Kafka