Lead the development of comprehensive AI lifecycle frameworks encompassing activities, risk assessments, and approval processes that enable consistent AI project execution while ensuring safety and compliance with regulatory expectations.
Drive the creation of differentiated governance approaches for various AI scenarios and risk profiles.
Partner with the Data & AI organization, CIO, Architecture, SSRE, Legal, Compliance, ERM, and Third-Party Risk teams to ensure comprehensive lifecycle coverage.
Evolve the Gen AI review board process.
Define and implement multiple governance checkpoints with clear governing criteria for each stage.
Lead the development of comprehensive documentation requirements and reference guides, and ensure alignment with existing policies, standards, and controls.
Integrate AI governance frameworks with Google Cloud Platform services and other enterprise platforms.
Oversee vendor selection and implementation of an AI governance system to instrument AI lifecycle activities and sign-offs, and to manage the AI inventory and portfolio reporting.
Requirements
Bachelor's degree required; Master's degree in Business, Technology, Risk Management, or related field preferred
15+ years of experience in governance functions, risk management, AI/ML operations, or technology program management
5+ years of experience with AI/DS/ML projects
Demonstrated experience partnering with Legal, Compliance, and Risk Management functions and building governance frameworks
Strong understanding of AI/ML technologies and associated governance challenges
Benefits
Short-term or annual bonuses
Long-term incentives
On-the-spot recognition
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
AI lifecycle frameworks, risk assessments, governance approaches, AI project execution, documentation requirements, Google Cloud Platform, AI Governance system, AI portfolio reporting, AI/ML operations, technology program management