
ETL Specialist, Apache Hop, Azure, Databricks, Kafka, Spark, Hadoop, Python, Airflow
Synechron
full-time
Location Type: Office
Location: Bengaluru • India
About the role
- Design, develop, and maintain ETL and data integration workflows using Apache Hop.
- Extract data from multiple heterogeneous sources, transform it according to business rules, and load it into target systems.
- Optimize ETL pipelines to improve performance and ensure data integrity.
- Implement monitoring, error handling, and alerting mechanisms to maintain pipeline reliability.
- Collaborate with data architects, analysts, and business stakeholders to gather and refine data requirements.
- Leverage cloud platforms such as Azure and Databricks to build scalable data solutions.
- Lead complex data engineering initiatives and provide technical guidance to peers.
- Conduct code reviews and enforce best practices in data engineering and pipeline development.
Requirements
- Minimum of 6 years of experience in ETL development, data integration, or related data engineering roles.
- Proven track record designing and maintaining complex data pipelines using Apache Hop or equivalent tools.
- Experience integrating with data platforms including Kafka, Spark, and Hadoop.
- Demonstrated expertise with cloud technologies, particularly Azure and Databricks.
- Hands-on experience implementing workflow orchestration and automation using Airflow.
- Experience leading technical initiatives and collaborating across multidisciplinary teams.
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent experience.
Benefits
- Professional certifications in data engineering, cloud platforms, or ETL tools are advantageous.
- Ongoing professional development through training, courses, or certifications is encouraged.
- Flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
ETL development, data integration, data pipelines, data transformation, data integrity, workflow orchestration, automation, code reviews, data engineering, monitoring and error handling
Soft Skills
collaboration, technical guidance, leadership, communication, problem-solving
Certifications
Bachelor’s degree in Computer Science, Bachelor’s degree in Information Systems, Bachelor’s degree in Engineering