Tech Stack
Airflow, Cloud, ETL, Hadoop, PySpark
About the role
- Design and implement large-scale data architecture for governing financial crime compliance processes
- Develop and maintain data pipelines and integrate data from various sources
- Ensure data quality, security, and compliance
- Optimize on-premises and cloud computing resources for processing, storage, performance and scalability
- Construct and maintain a modern data platform to counteract money laundering and fraud
- Collaborate with stakeholders to meet data requirements and lead data engineering projects and teams
- Work within the Enterprise Compliance Engineering team, and lead/coordinate onsite/offshore scrum teams
Requirements
- Bachelor’s degree in Computer Science, Engineering, or a related discipline, or equivalent work experience
- 8-12 years of experience in data engineering and data modeling
- Experience in developing data warehouses, data lakes, and data marts using big data stacks
- Experience in data analysis, migration, cleansing, transformation, and integration using ETL tools
- Experience in developing data pipelines using PySpark, with strong knowledge of Hadoop, Hive, and Impala
- Advanced technical expertise in using Snowflake cloud databases and developing pipelines with Snowpark
- Extensive knowledge of orchestration frameworks like Airflow
- Experience in BI report development is an added advantage
- Domain knowledge in Anti-Money Laundering, Sanctions Screening and Surveillance preferred
- Experience leading scrum teams that work in an Onsite/Offshore model
- Must not require sponsorship for employment visa status (including student visas), now or in the future, in the country where applying