Data Engineer

Snowflake • full-time • 🇮🇳 India

Job Level

Mid-Level, Senior

Tech Stack

Airflow, Apache, AWS, Azure, Cloud, ETL, Google Cloud Platform, Python, SQL

About the role

  • Design, build, and launch production-ready data models and data pipelines that scale effectively
  • Implement enterprise-grade data governance frameworks and maintain data quality standards
  • Develop and optimize data ingestion processes from various enterprise data sources
  • Create efficient data transformation workflows to prepare data for business intelligence, ML/AI, and data products
  • Partner with data scientists, product managers, and business stakeholders to understand and address data requirements
  • Work closely with business application owners to ensure seamless automation and data management processes
  • Participate actively in Agile ceremonies including daily standups, sprint planning, sprint reviews, and retrospectives
  • Build strong relationships with stakeholders at all levels, from executive to operational teams
  • Lead quality assurance efforts by defining testing strategies and identifying potential risks
  • Ensure timely identification, tracking, and resolution of technical issues
  • Build deep data expertise and take ownership of data quality for assigned business domains
  • Align development efforts with the product roadmap when building tools and solutions
  • Mature requirements-gathering processes and follow Agile methodologies for data product development
  • Proactively adapt to changing business requirements while maintaining solution integrity

Requirements

  • Bachelor's degree in Computer Science, Information Systems, or a related field (preferably with an emphasis on system design and data warehousing/modeling), or equivalent experience
  • 3-5 years of direct experience developing and managing data pipelines for ingestion and transformation
  • Demonstrated expertise with modern cloud data infrastructure and data modeling techniques
  • Experience with data pipeline development, ETL/ELT processes, and data warehousing concepts
  • Experience with SQL
  • Experience with Python
  • Preferred: Knowledge of Snowflake, Apache Airflow, and dbt (data build tool)
  • Desired: Cloud infrastructure (AWS, Azure, or GCP)
  • Understanding of data governance principles and best practices
  • Excellent written and verbal communication skills to effectively convey technical concepts
  • Strong attention to detail and results-oriented mindset
  • Ability to adapt quickly to changing requirements and priorities
  • Commitment to exceptional customer service and quality deliverables
  • Collaborative approach to problem-solving with cross-functional teams
  • Self-motivated with the ability to work independently and as part of a team
  • Effective time management and organizational skills
  • Authorization to work in the country to which you are applying; sponsorship needs will be asked about during the application process