U.S. Bank

Senior Cloud Data Engineer – Multi-Cloud Data Platforms

Full-time

Location Type: Hybrid

Location: Irving, Illinois, Minnesota, United States

Salary

$119,765 - $140,900 per year

About the role

  • Design, build, and maintain cloud-native data pipelines and data products across Azure and AWS using Databricks and Snowflake.
  • Lead and contribute to the modernization and migration of on‑prem and legacy data platforms to cloud-based solutions.
  • Implement batch and streaming data processing patterns using Spark and cloud-native services.
  • Partner with data governance, security, and risk teams to ensure data products comply with enterprise governance, data privacy, and regulatory requirements.
  • Enable secure data sharing and access patterns across domains and platforms using appropriate controls.
  • Define and promote data engineering best practices, including CI/CD, testing, observability, performance tuning, and cost optimization.
  • Collaborate with product owners and analytics teams to translate business requirements into well-modeled, high-quality datasets.
  • Work closely with cloud and security architects to implement secure, scalable, and resilient data solutions.
  • Support and mentor junior engineers through design reviews, code reviews, and technical guidance.

Requirements

  • Bachelor’s degree, or equivalent work experience
  • Three to five years of relevant experience
  • 8+ years of experience in data engineering, with significant experience on cloud platforms.
  • Proven hands-on experience building and operating data solutions in Azure and/or AWS.
  • Strong experience delivering production-grade data pipelines and data products.
  • Solid understanding of data governance, data quality, and security concepts in regulated environments.
  • Excellent communication skills and ability to collaborate across engineering, product, and governance teams.
  • Experience with data architecture and platform design in large enterprises.
  • Strong hands-on experience with Azure Data Platform services, including Azure Data Factory, Azure Data Lake Storage, and Azure Synapse Analytics (or equivalent Microsoft Fabric experience).
  • Experience with AWS data services, such as AWS Glue, S3, and event-driven integrations.
  • Deep experience with Databricks (Spark, Delta Lake, performance tuning).
  • Strong working knowledge of Snowflake, including data modeling, ingestion patterns (e.g., Snowpipe), and data sharing.
  • Expertise in Apache Spark for large-scale data processing.
  • Experience building batch and near-real-time data pipelines.
  • Strong SQL skills and experience with dimensional and analytical data modeling.
  • Experience designing reusable, domain-oriented data products.
  • Experience with API-based integrations (REST; familiarity with SOAP and GraphQL is a plus).
  • Hands-on experience integrating with API gateways.
  • Understanding of messaging and streaming platforms such as Kafka, MQ, AWS SQS, or RabbitMQ.
  • Strong understanding of IAM, RBAC, OAuth 2.0, TLS/mTLS, and JWT.
  • Experience implementing secure data access patterns in cloud environments.
  • Familiarity with data cataloging, lineage, and metadata management concepts.
  • Experience enabling self-service analytics and BI using tools such as Power BI, Tableau, or equivalent.
  • Experience supporting AI initiatives through the data platform and data products.
  • Prior experience in financial services or other highly regulated industries.
  • Professional certifications in Microsoft Azure and/or AWS.
  • Strong problem-solving skills and a track record of delivering scalable, efficient data solutions.
  • Master’s degree in a relevant technical field.

Benefits

  • Healthcare (medical, dental, vision)
  • Basic term and optional term life insurance
  • Short-term and long-term disability
  • Pregnancy disability and parental leave
  • 401(k) and employer-funded retirement plan
  • Paid vacation (from two to five weeks depending on salary grade and tenure)
  • Up to 11 paid holiday opportunities
  • Adoption assistance
  • Sick and Safe Leave accruals of one hour for every 30 hours worked, up to 80 hours per calendar year, unless otherwise provided by law

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, cloud-native data pipelines, data products, batch processing, streaming data processing, data modeling, SQL, API integrations, Apache Spark, data architecture
Soft Skills
communication skills, collaboration, mentoring, problem-solving, leadership
Certifications
Microsoft Azure certification, AWS certification