Provide production support and develop ETL mappings and workflows for data warehouse environments
Administer and support Informatica PowerCenter, Data Quality, Web Services, PowerExchange, Informatica Cloud (IDMC), and DVO
Implement and maintain version control (Bitbucket/GitHub) and scheduling (AutoSys) integrations
Implement security best practices for Data Integration Platforms and enforce Secure by Design principles
Develop and maintain reusable automation scripts for monitoring Informatica services, CPU/memory, volume groups/SAN, network, and backups, as well as workflow/taskflow start scripts
Perform performance tuning of ETL code and platform components; identify and remediate bottlenecks
Use UNIX shell scripting and tools (pmcmd/pmrep) to manage and automate tasks
Create and manage TNS, ODBC, and DB2 connectivity entries to databases, including Amazon RDS
Work with AWS services (Glue, S3, EKS, Data Pipeline, Step Functions, Redshift, EMR) and integrate with Informatica and Spark
Build ETL code using PySpark on big data processing platforms such as Apache Spark
Collaborate with development, enterprise architects and onshore/offshore teams for tool assessments, architecture discussions, and product roadmaps
Create and maintain documentation in Confluence and Jira and provide support/troubleshooting to stakeholders
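The pmcmd automation called out above can be sketched as a small Python wrapper; the workflow, folder, service, and domain names below are hypothetical placeholders, and the flags shown are the standard pmcmd startworkflow options:

```python
import shutil
import subprocess

def build_pmcmd_start(workflow, folder, service, domain,
                      user_env="PMUSER", pwd_env="PMPASSWD"):
    """Construct a pmcmd startworkflow command line.

    Credentials are read from environment variables at invocation time
    (-uv/-pv flags) so they never appear in plain text on the command line.
    """
    return [
        "pmcmd", "startworkflow",
        "-sv", service,   # Integration Service name (hypothetical)
        "-d", domain,     # Informatica domain name (hypothetical)
        "-uv", user_env,  # env var holding the repository user name
        "-pv", pwd_env,   # env var holding the password
        "-f", folder,     # repository folder containing the workflow
        "-wait",          # block until the workflow completes
        workflow,
    ]

def start_workflow(workflow, folder, service, domain):
    """Run the workflow only if the pmcmd binary is on PATH."""
    cmd = build_pmcmd_start(workflow, folder, service, domain)
    if shutil.which("pmcmd") is None:
        return None  # pmcmd not installed in this environment
    return subprocess.run(cmd, check=True).returncode
```

In practice a wrapper like this would be invoked from a scheduler (e.g. AutoSys) so that exit codes propagate to job status.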
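A minimal sketch of the CPU/disk monitoring described above, using only the Python standard library; the mount points and the 90% disk threshold are assumptions for illustration, not values from this posting:

```python
import os
import shutil

def health_snapshot(mount_points=("/",), disk_pct_threshold=90.0):
    """Collect a simple host health snapshot: CPU load averages plus
    per-mount disk usage, flagging any volume over the threshold."""
    load1, load5, load15 = os.getloadavg()  # POSIX-only load averages
    disks = {}
    for mp in mount_points:
        usage = shutil.disk_usage(mp)
        pct = usage.used / usage.total * 100
        disks[mp] = {"pct_used": round(pct, 1),
                     "alert": pct >= disk_pct_threshold}
    return {"load": {"1m": load1, "5m": load5, "15m": load15},
            "disks": disks}
```

A cron- or scheduler-driven script could log this snapshot and page on any `alert` flag.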
Requirements
7+ years of experience in Data Integration products
Familiarity with data processing systems such as Apache Spark and PySpark
Strong proficiency with AWS cloud services including Glue, S3, EKS, Data Pipeline, Step Functions, Redshift, and Amazon EMR
Hands-on experience with data integration platforms like Informatica (PowerCenter, Data Quality, Webservices, PowerExchange, Informatica Cloud/IDMC, DVO)
Expertise in Informatica Administration tasks: Installation, Configuration of domains, Code Promotions/Migrations, managing users/groups/privileges, backups and restores
Hands-on experience developing ETL mappings, workflows and providing production support for critical data warehouse environments
Familiarity with version control tools like Bitbucket and GitHub
Familiarity with scheduling tools like AutoSys
Hands-on experience implementing security for Informatica environments/domains
Experience implementing reusable automation scripts for server and service monitoring, backups, and workflow starts
Experience with performance tuning ETL code and identifying bottlenecks
Expertise in UNIX shell scripting and using pmcmd/pmrep
Experience creating TNS, ODBC, and DB2 connectivity entries and connecting to Amazon RDS
Experience migrating complex applications between environments using Informatica deployment groups and folder/XML export-import
Experience working in 24/7 support environments using ITIL processes
Experience leading onshore/offshore teams and large data migration/product upgrade projects
Demonstrated experience designing and implementing data platform integration infrastructures using the AWS Well-Architected Framework
Excellent communication, documentation, and problem-solving skills
Bachelor's degree in Computer Science or a closely related discipline, or an equivalent combination of formal education and experience
Informatica Professional Certification (preferred)
Visa sponsorship/support is not anticipated for this position (based on business needs)
Benefits
The typical base pay range for this role is between $145K and $182K, depending on location and experience
May be eligible for certain discretionary performance-based bonus and/or incentive compensation
Comprehensive health and wellness benefits
Retirement plans
Educational assistance and training programs
Income replacement for qualified employees with disabilities
Paid maternity and parental bonding leave
Paid vacation, sick days, and holidays
Total Rewards program providing a competitive benefits package
Hybrid work arrangement: work at MUFG office or client sites four days per week and remote one day