Medsuite Inc

Snowflake Data Engineer

Full-time

Origin: 🇮🇳 India


Job Level

Mid-Level / Senior

Tech Stack

Apache, Azure, Cloud, ETL, Kafka, Matillion, Python, Spark, SQL

About the role

  • Participate in the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
  • Implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
  • Optimize Snowflake database performance.
  • Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
  • Ensure data quality, integrity, and governance.
  • Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.
  • Conduct independent data analysis and data discovery to understand existing source systems, fact and dimension data models.
  • Implement an enterprise data warehouse solution in Snowflake and take direction from the Lead Snowflake Data Engineer and Director of Data Engineering.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
  • 4+ years of in-depth data engineering experience, including at least 1 year of dedicated experience engineering solutions in an enterprise-scale Snowflake environment.
  • Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
  • Strong experience with cloud platforms (preferably Azure) and their data services.
  • Experience in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
  • Hands-on experience with scripting languages like Python for data processing.
  • Snowflake SnowPro certification, with preference for the data engineering certification path.
  • Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
  • Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
  • Familiarity with BI and visualization tools such as Power BI.
  • Familiarity working in an agile Scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
  • Ability to self-manage medium-complexity deliverables and document user stories and tasks in Azure DevOps.
  • Personal accountability to committed sprint user stories and tasks.
  • Strong analytical and problem-solving skills with ability to handle complex data challenges.
  • Strong oral, written, and interpersonal communication skills.
  • Ability to read, understand, and apply state/federal laws, regulations, and policies.