
Azure Data Engineer
EXL
Full-time
Location Type: Hybrid
Location: Gurugram • India
About the role
**Role:** Azure Data Engineer
**Location:** Gurugram
**Work Mode:** Hybrid

**Key Responsibilities:**
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

**Must-Have Skills:**
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

**Good-to-Have Skills:**
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

**Qualifications:**
- Bachelor’s or master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 4+ years of experience in data engineering roles using Azure and Snowflake.

**Key Skills:** Azure, Snowflake, SQL, Data Factory, DBT
Requirements
- Bachelor’s or master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 4+ years of experience in data engineering roles using Azure and Snowflake.
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.
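To give candidates a concrete sense of the SQL extraction-and-transformation work the requirements describe, here is a minimal sketch of an ELT-style aggregation. It uses Python's built-in sqlite3 as a stand-in for Snowflake, and the table and column names (`raw_orders`, `region_totals`, etc.) are invented for illustration, not taken from the posting:

```python
import sqlite3

# In-memory database standing in for a Snowflake warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO raw_orders VALUES
        (1, 'North', 120.0),
        (2, 'North', 80.0),
        (3, 'South', 200.0);
""")

# A typical ELT transformation: aggregate raw rows into a reporting table,
# the same pattern a DBT model or SnowSQL script would express.
conn.execute("""
    CREATE TABLE region_totals AS
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM raw_orders
    GROUP BY region
""")

for row in conn.execute(
    "SELECT region, total_amount, order_count FROM region_totals ORDER BY region"
):
    print(row)
# ('North', 200.0, 2)
# ('South', 200.0, 1)
```

In a real Snowflake/DBT setup, the `CREATE TABLE ... AS SELECT` step would live in a versioned DBT model and be orchestrated by an Azure Data Factory pipeline rather than run inline.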
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
ETL, ELT, SQL, Snowflake, DBT, Azure Data Factory, SnowSQL, Python, PySpark, CI/CD