Medsuite Inc

Snowflake Data Engineer

full-time

Location: 🇮🇳 India

Job Level

Mid-Level, Senior

Tech Stack

Apache, Azure, Cloud, ETL, Kafka, Matillion, Python, Spark, SQL

About the role

  • Ventra is a leading business solutions provider for facility-based physicians focused on revenue cycle management
  • Design, develop, and maintain a scalable Snowflake data solution for enterprise data & analytics
  • Implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies
  • Optimize Snowflake database performance and ensure scalability
  • Collaborate with data analysts, business analysts, data scientists, and software engineers to define data solutions
  • Ensure data quality, integrity, and governance
  • Troubleshoot and resolve data-related issues to ensure high availability and performance
  • Perform independent data analysis and data discovery of source systems, fact and dimension models
  • Implement enterprise data warehouse solutions following Medallion architecture best practices
  • Take direction from Lead Snowflake Data Engineer and Director of Data Engineering while contributing domain expertise

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
  • 4+ years of in-depth data engineering experience
  • 1+ years of dedicated experience engineering solutions in an enterprise-scale Snowflake environment
  • Strong expertise in ANSI SQL, performance tuning, and data modeling techniques
  • Strong experience with cloud platforms (preference for Azure) and their data services
  • Experience in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran
  • Hands-on experience with scripting languages like Python for data processing
  • Snowflake SnowPro certification (preferred)
  • Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC)
  • Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming
  • Familiarity with BI and visualization tools such as Power BI
  • Familiarity working in an Agile Scrum team and using Azure DevOps for user stories/tasks
  • Ability to self-manage medium complexity deliverables and strong analytical/problem-solving skills
  • Ability to read, understand, and apply state/federal laws, regulations, and policies
  • Strong oral, written, and interpersonal communication skills
  • Strong time management and organizational skills