Data Engineer – Snowflake, dbt

Nagarro

Full-time

Location Type: Remote

Location: Remote • 🇵🇱 Poland

Job Level

Mid-Level • Senior

Tech Stack

AWS • Cloud • ETL • Python • SQL

About the role

  • Design, develop, and maintain scalable data pipelines using Snowflake and dbt
  • Write and optimize advanced SQL queries for performance and reliability
  • Implement ETL/ELT processes to ingest and transform data from multiple sources
  • Develop Python scripts for automation, data processing, and API integrations
  • Build and manage data workflows using AWS services such as Glue, Lambda, S3, and CloudFormation
  • Design and maintain data warehouse models, schemas, and transformations
  • Collaborate with cross-functional teams to understand data requirements and deliver analytical solutions
  • Implement and maintain version control, CI/CD pipelines, and best development practices
  • Monitor, troubleshoot, and optimize data pipelines for performance and cost efficiency
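The ELT responsibilities above can be sketched at a small scale. The following is a minimal, illustrative Python example of an extract–transform–load step; the table layout, column names, and cleaning rules are hypothetical, and a real pipeline at this role would load into Snowflake (for example via an S3 stage) and express the transforms as dbt models rather than standard-library code:

```python
import csv
import io

# Hypothetical raw extract: order records as they might arrive from a source system.
RAW_CSV = """order_id,amount,currency
1001, 250.00 ,usd
1002,99.50,USD
1003,,usd
"""

def transform(rows):
    """Clean and standardize records before loading to a warehouse table.

    Mirrors the kind of logic a dbt model or Python ETL task would apply:
    trim whitespace, normalize casing, drop rows with missing amounts.
    """
    cleaned = []
    for row in rows:
        amount = row["amount"].strip()
        if not amount:  # skip incomplete records
            continue
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": float(amount),
            "currency": row["currency"].strip().upper(),
        })
    return cleaned

def run_pipeline(raw_csv):
    # Extract: parse the raw feed.
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: apply cleaning rules.
    records = transform(rows)
    # Load: in practice this would write to a Snowflake table;
    # here we simply return the cleaned records.
    return records

if __name__ == "__main__":
    for record in run_pipeline(RAW_CSV):
        print(record)
```

In a production setting the transform step would typically live in version-controlled dbt SQL models, with the Python layer handling orchestration, API ingestion, and AWS glue code (Lambda triggers, S3 staging).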

Requirements

  • Strong hands-on experience with Snowflake
  • Advanced SQL proficiency
  • Strong understanding of ETL/ELT concepts and data pipelines
  • Hands-on experience with dbt
  • Solid knowledge of data warehousing concepts, including schema design and data modeling
  • Proficiency in Python for scripting and automation
  • Experience with AWS services (Glue, Lambda, S3, CloudFormation)
  • Familiarity with Git and CI/CD practices
  • Understanding of APIs and CRUD operations
  • Exposure to cloud-native data architectures
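The "APIs and CRUD operations" requirement can be illustrated with a minimal in-memory sketch of the four operations (create, read, update, delete); the `UserStore` class and its fields are hypothetical, and a real service would expose these over HTTP against a database:

```python
class UserStore:
    """In-memory stand-in for a resource that a CRUD API would manage."""

    def __init__(self):
        self._users = {}
        self._next_id = 1

    def create(self, name):
        # Create: insert a new record and assign it an id.
        user = {"id": self._next_id, "name": name}
        self._users[self._next_id] = user
        self._next_id += 1
        return user

    def read(self, user_id):
        # Read: fetch a record by id, or None if absent.
        return self._users.get(user_id)

    def update(self, user_id, name):
        # Update: modify an existing record; None if it does not exist.
        if user_id in self._users:
            self._users[user_id]["name"] = name
            return self._users[user_id]
        return None

    def delete(self, user_id):
        # Delete: remove the record, reporting whether it existed.
        return self._users.pop(user_id, None) is not None
```

Each method maps to an HTTP verb in a typical REST API: `create` to POST, `read` to GET, `update` to PUT/PATCH, and `delete` to DELETE.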

Benefits

  • No specific benefits mentioned in the job posting.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Snowflake • SQL • ETL • ELT • dbt • Python • AWS • data warehousing • data modeling • API
Soft skills
collaboration • communication