
Senior Data Engineering Manager – GCP Frameworks
Wells Fargo
Full-time
Location Type: Office
Location: Iselin • New Jersey • North Carolina • United States
Salary
💰 $139,000 - $260,000 per year
About the role
- Manage, coach, and grow multiple agile teams (data engineering, platform engineering, SRE/DevOps, QA) to deliver high-quality, resilient data capabilities; build a culture of talent development, engineering excellence, psychological safety, and continuous improvement
- Define and drive the roadmap for GCP-based data platforms (BigQuery, Dataflow/Apache Beam, Pub/Sub, Dataproc/Spark, Cloud Storage, Cloud Composer/Airflow, Dataplex, Data Catalog)
- Lead the migration of legacy data pipelines, warehouses, and integration workloads to GCP, including CDC, batch and streaming, API-first data products, and event-driven architectures (see the streaming pipeline sketch after this list)
- Partner with enterprise, data, and security architects to align on target state architecture, data modeling (dimensional, Data Vault), and domain-driven data products; establish and enforce DataOps and DevSecOps practices (CI/CD, IaC/Terraform, automated testing, observability)
- Embed defense-in-depth controls (VPC Service Controls, private IP, CMEK/Cloud KMS, DLP, IAM least privilege, tokenization, data masking, and data lineage); ensure adherence to financial services regulations and standards (e.g., SOX, GLBA, BCBS 239, model governance)
- Define SLOs/SLIs, runbooks, incident response, capacity planning, and performance tuning for BigQuery/Dataflow/Spark workloads; optimize cost and performance via partitioning/clustering, workload management, autoscaling, and right-sizing (see the partitioning/clustering sketch after this list)
- Influence senior technology leaders and business stakeholders; translate business needs into platform roadmaps and measurable outcomes; manage budgets, resource plans, and strategic vendor/partner engagements
- Scale onboarding of lines of business to the platform, including templates, blueprints, guardrails, and self-service developer experience
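As context for the streaming and migration responsibilities above, here is a minimal sketch of a Dataflow-style Apache Beam pipeline that reads events from Pub/Sub and appends them to BigQuery. The project, topic, table, and schema names are illustrative assumptions, not details from this posting.

```python
# Minimal Apache Beam streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
# Project, topic, table, and schema names are placeholders, not actual resources.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True marks this as a streaming pipeline (runs on Dataflow with the DataflowRunner).
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```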
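The cost and performance responsibility above calls out partitioning and clustering; a minimal sketch using the google-cloud-bigquery Python client follows. The project, dataset, table, and column names are assumptions for illustration only.

```python
# Minimal sketch: create a BigQuery table with daily partitioning and clustering
# to limit scanned bytes and control query cost. Names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

table = bigquery.Table(
    "my-project.analytics.transactions",
    schema=[
        bigquery.SchemaField("txn_id", "STRING"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("txn_date", "DATE"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)

# Daily partitioning on txn_date prunes partitions when queries filter by date.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="txn_date",
)

# Clustering on customer_id co-locates related rows within each partition.
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```

Queries that filter on txn_date and customer_id can then scan only the relevant partitions and blocks, which is one of the levers for BigQuery cost control mentioned above.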
Requirements
- 6+ years of Data Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
- 3+ years of management or leadership experience
- 3+ years of experience managing direct reports on multi-disciplinary engineering teams, including, but not limited to, assigning tasks, conducting performance evaluations, and determining salary adjustments
- 6+ years of hands-on experience in data engineering and platform build-outs using modern stacks and automation
- 4+ years of production experience on GCP with several of: BigQuery, Dataflow/Apache Beam, Pub/Sub, Dataproc/Spark, Cloud Storage, Cloud Composer/Airflow, Dataplex, Data Catalog
- 6+ years of experience with programming and data skills: SQL plus one or more of Python, Java, Scala; solid understanding of data modeling and data quality frameworks
- 4+ years of DevOps/DataOps experience: CI/CD, Terraform/IaC, automated testing, observability, GitOps
- 6+ years of experience leading large-scale data platform migrations or modernizations in regulated environments
- 3+ years leading AI/ML and Generative AI data initiatives
Benefits
- Health benefits
- 401(k) Plan
- Paid time off
- Disability benefits
- Life insurance, critical illness insurance, and accident insurance
- Parental leave
- Critical caregiving leave
- Discounts and savings
- Commuter benefits
- Tuition reimbursement
- Scholarships for dependent children
- Adoption reimbursement
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data engineering, GCP, BigQuery, Dataflow, Apache Beam, Pub/Sub, Dataproc, Spark, SQL, Python
Soft Skills
management, leadership, coaching, communication, influence, strategic planning, performance evaluation, talent development, psychological safety, continuous improvement