
About the role
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows that move data from various sources into Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
Requirements
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong experience with Microsoft Azure cloud services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.
Good-to-Have Skills
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.
Benefits
- Hybrid work environment
- Collaboration with data analysts, architects, and DevOps teams in a cloud-native environment.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
ETL, ELT, SQL, Azure Data Factory, Snowflake, DBT, SnowSQL, Python, PySpark, CI/CD
Soft skills
communication, stakeholder management, troubleshooting, data quality enforcement, documentation