Tech Stack
Airflow, Azure, ETL, NoSQL, Scala, SQL, SSIS
About the role
- Report to the Engineering Manager and participate in the planning, design, and implementation of a centralized data warehouse solution for data acquisition, ingestion, and large-scale data processing.
- Lead the design and development of ETL processes and data integration solutions.
- Develop and maintain ETL workflows using tools such as SSIS, Azure Databricks, Spark SQL, or similar.
- Collaborate with data engineers, analysts, and business stakeholders to gather requirements and translate them into technical solutions.
- Optimize ETL processes for performance, scalability, and reliability.
- Conduct code reviews, provide technical guidance, and mentor junior developers.
- Troubleshoot and resolve issues related to ETL processes and data integration.
- Ensure compliance with data governance, security policies, and best practices.
- Document ETL processes and maintain comprehensive technical documentation.
- Stay updated with the latest trends and technologies in data integration and ETL.
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 10-12 years of experience in ETL development and data integration.
- Expertise in ETL tools and languages such as SSIS, T-SQL, Azure Databricks, or similar.
- Knowledge of various SQL/NoSQL data storage mechanisms and Big Data technologies.
- Experience in Data Modeling.
- Knowledge of Azure Data Factory, Azure Databricks, Azure Data Lake.
- Experience with Scala, Spark SQL, and Airflow is preferred.
- Proven experience in data engineering and designing scalable ETL solutions.
- Excellent problem-solving and analytical skills.
- Strong communication and leadership skills.
- Ability to work effectively in a team-oriented environment.
- Experience working with Agile methodologies.
- Healthcare industry experience preferred.