GM Financial

Data Engineer III – Data & AI Enablement

Full-time

Location Type: Hybrid

Location: Irving, Texas, United States


About the role

  • Design and build scalable data pipelines and workflows on Azure Databricks, leveraging Delta Lake, Unity Catalog, and Medallion (Bronze/Silver/Gold) Lakehouse architecture.
  • Lead end-to-end migration of legacy SAS analytical workloads to Databricks/PySpark, including SAS macro translation, data validation, and output reconciliation.
  • Drive migration of Oracle databases and stored procedures to Azure-native services (Azure SQL, Synapse Analytics, or Delta Lake), ensuring data fidelity and business continuity.
  • Architect and implement ELT/ETL frameworks supporting batch and near-real-time data ingestion from diverse financial source systems.
  • Design dimensional models, data vault patterns, and lakehouse table structures optimized for financial analytical and regulatory reporting workloads.
  • Build and maintain data pipelines that power AI/ML models, including feature engineering, training data preparation, and inference feeds using Databricks MLflow and Feature Store.
  • Implement LLM and RAG (Retrieval-Augmented Generation) pipeline patterns for internal analytics and tooling where applicable.
  • Design and consume REST APIs for data ingestion, orchestration, and data product exposure; build integrations with third-party financial data providers.
  • Implement event-driven and streaming data patterns using Azure Event Hubs, Service Bus, and Kafka.
  • Enforce data governance policies in Unity Catalog: column-level security, row-level filtering, PII masking, and audit logging.
  • Implement data quality frameworks with automated alerting and lineage tracking.
  • Ensure all data solutions comply with applicable financial regulations: SOX, BCBS 239, GDPR/CCPA, CCAR, DFAST, Basel III/IV.
  • Apply data security controls including encryption, Azure Key Vault management, Private Endpoints, VNet integration, and Managed Identity.
  • Define engineering standards and best practices; lead code reviews and technical design sessions.
  • Mentor junior and mid-level engineers through pairing, reviews, and knowledge sharing.
  • Produce technical documentation including design docs, runbooks, and Architecture Decision Records (ADRs).
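A recurring task across the responsibilities above is output reconciliation when re-platforming SAS jobs onto PySpark: proving that the migrated job produces the same numbers as the legacy one. A minimal, library-free sketch of such a check (function, column, and tolerance choices are illustrative assumptions, not part of GM Financial's stack):

```python
import math

def reconcile(legacy_rows, migrated_rows, key, tolerance=1e-6):
    """Compare legacy (e.g. SAS) output rows against migrated (e.g. PySpark)
    output rows, matched on `key`. Floats must agree within `tolerance`;
    everything else must match exactly. Returns a list of discrepancies."""
    legacy = {r[key]: r for r in legacy_rows}
    migrated = {r[key]: r for r in migrated_rows}
    issues = []
    for k in legacy.keys() | migrated.keys():
        if k not in migrated:
            issues.append((k, "missing in migrated output"))
        elif k not in legacy:
            issues.append((k, "unexpected extra row"))
        else:
            for col, a in legacy[k].items():
                b = migrated[k].get(col)
                if isinstance(a, float) and isinstance(b, float):
                    if not math.isclose(a, b, abs_tol=tolerance):
                        issues.append((k, f"{col}: {a} != {b}"))
                elif a != b:
                    issues.append((k, f"{col}: {a!r} != {b!r}"))
    return issues

# Illustrative data: one row matches within tolerance, one does not.
legacy = [{"acct": 1, "bal": 100.0}, {"acct": 2, "bal": 250.5}]
migrated = [{"acct": 1, "bal": 100.0000001}, {"acct": 2, "bal": 251.0}]
issues = reconcile(legacy, migrated, "acct")
# issues -> [(2, "bal: 250.5 != 251.0")]
```

In a real migration this comparison would run over DataFrames exported from both engines, but the shape of the check (key join, tolerant numeric compare, exact compare elsewhere) is the same.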

Requirements

  • Bachelor's degree in Computer Science, Information Systems, Data Science, Statistics, Mathematics, Engineering, or a related quantitative field, or equivalent experience, required.
  • Master's degree strongly preferred in Computer Science, Data Engineering, Data Science, Applied Mathematics, or a related discipline.
  • 4-6 years of professional experience in data engineering, data platform development, or a closely related data role required.
  • Experience with Big Data technologies required.
  • 3+ years of hands-on experience with Azure Databricks (PySpark, Delta Lake, Workflows, Unity Catalog).
  • 3+ years of cloud data engineering experience on Microsoft Azure (ADF, ADLS Gen2, Synapse, Azure SQL, Event Hubs, Key Vault).
  • 2+ years of proven experience in SAS-to-cloud migration — translating SAS PROC SQL, macros, and data steps to PySpark or SQL equivalents.
  • 2+ years of experience leading Oracle database migration to Azure cloud targets.
  • 2+ years of experience working in a regulated financial services environment (banking, capital markets, insurance, asset management, or fintech).
  • Demonstrated experience designing and consuming REST APIs and building event-driven integration patterns.
  • Hands-on experience with AI/ML data pipeline development, including feature stores, training data pipelines, and MLflow-based workflows.
  • Proven track record delivering production-grade data systems in complex enterprise environments.
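To give a flavor of the SAS PROC SQL translation work called for above: a typical aggregation step ports almost verbatim to Spark SQL. The sketch below is illustrative only; table and column names are invented, and Python's built-in sqlite3 stands in as the engine so the snippet is self-contained. On Databricks the same query would run via spark.sql() against a Delta table.

```python
import sqlite3

# A typical SAS PROC SQL step being migrated:
#
#   proc sql;
#     create table bal_by_state as
#     select state, avg(balance) as avg_bal
#     from accounts
#     where status = 'OPEN'
#     group by state;
#   quit;
#
# The query body carries over nearly unchanged to Spark SQL.
PORTED_SQL = """
    SELECT state, AVG(balance) AS avg_bal
    FROM accounts
    WHERE status = 'OPEN'
    GROUP BY state
    ORDER BY state
"""

# sqlite3 stand-in engine with illustrative data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (state TEXT, status TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO accounts VALUES (?, ?, ?)",
    [("TX", "OPEN", 100.0), ("TX", "OPEN", 300.0),
     ("TX", "CLOSED", 999.0), ("OK", "OPEN", 50.0)],
)
rows = conn.execute(PORTED_SQL).fetchall()
# rows -> [('OK', 50.0), ('TX', 200.0)]
```

The harder parts of a real SAS migration are macro expansion and data-step semantics, which do not map one-to-one; set-based PROC SQL logic like this is usually the straightforward portion.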

Benefits

  • Generous benefits package available on day one, including:
  • 401K matching
  • Bonding leave for new parents (12 weeks, 100% paid)
  • Tuition assistance
  • Training
  • GM employee auto discount
  • Community service pay
  • Nine company holidays

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, Azure Databricks, PySpark, Delta Lake, ETL, ELT, data validation, data governance, data quality frameworks, REST APIs
Soft Skills
leadership, mentoring, technical documentation, collaboration, communication
Certifications
Bachelor's degree, Master's degree