GM Financial

Data Engineer II – Digital Technology

Full-time
Location Type: Hybrid

Location: Irving, Texas, United States


About the role

  • Contribute to evaluation, research, and experimentation efforts with batch and streaming data engineering technologies in a lab setting to keep pace with industry innovation
  • Work with data engineering groups to showcase the capabilities of emerging technologies and to enable the adoption of these technologies and their associated techniques
  • Contribute to the definition and refinement of processes and procedures for the data engineering practice
  • Work closely with data scientists, data architects, ETL developers, other IT counterparts, and business partners to identify, capture, collect, and format data from external sources, internal systems, and the data warehouse to extract features of interest
  • Code, test, deploy, monitor, document, and troubleshoot data engineering processes and associated automation

Requirements

  • Experience with Adobe solutions (ideally Adobe Experience Platform, XDM, RTCDP, DTM/Launch) and REST APIs
  • Experience with digital technology solutions (DMPs, CDPs, tag management platforms, cross-device tracking, SDKs, etc.); knowledge of Real-Time CDP and Journey Analytics solutions
  • SQL experience: querying data and communicating the insights that can be derived from it
  • Working knowledge of Agile development (SAFe, Scrum) and Application Lifecycle Management
  • Experience ingesting various data formats (JSON, Parquet, SequenceFile) from sources such as cloud databases, MQ, and relational databases such as Oracle
  • Experience with cloud technologies (such as Azure, AWS, GCP) and their native toolsets
  • Understanding of cloud computing technologies, business drivers and emerging computing trends.
  • Thorough understanding of hybrid cloud computing: virtualization technologies; Infrastructure as a Service, Platform as a Service, and Software as a Service delivery models; and the current competitive landscape
  • Working knowledge of object storage technologies, including but not limited to Azure Data Lake Storage (ADLS) Gen2, S3, MinIO, and Ceph
  • Strong background with source control management systems; build systems (Maven, Gradle, Webpack); code quality (Sonar); artifact repository managers (Artifactory); and Continuous Integration/Continuous Deployment (Azure DevOps)
  • Experience processing large data sets using Hadoop, HDFS, Spark, Kafka, Flume, or similar distributed systems
  • Experience with NoSQL data stores such as Cosmos DB, MongoDB, Cassandra, Redis, or Riak, or technologies that embed NoSQL with search, such as MarkLogic or Lily Enterprise
  • Experience creating and maintaining ETL processes
  • Knowledge of best practices in information technology governance and privacy compliance
  • Ability to troubleshoot complex problems and work across teams to meet commitments
  • Excellent computer skills and proficiency in digital data collection.
  • Ability to work in an Agile/Scrum team environment
  • Strong interpersonal, verbal, and writing skills.
  • Understanding of big data platforms and architectures, data stream processing pipelines/platforms, data lakes, and data lakehouses
  • Understanding of cloud solutions such as Google Cloud Platform, Microsoft Azure, and Amazon AWS architecture and services
  • Understanding of GDPR, privacy, and security topics
  • Proficiency with Microsoft Office software, data querying platforms (Databricks is a plus), and statistical programming tools such as Python
  • 2–4 years of hands-on data engineering experience required
  • Bachelor’s degree in related field or equivalent experience required

Benefits

  • Generous benefits package available on day one, including 401(k) matching, bonding leave for new parents (12 weeks, 100% paid), tuition assistance, training, the GM employee auto discount, community service pay, and nine company holidays

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
SQL, Hadoop, HDFS, Spark, Kafka, ETL, Python, Adobe Experience Platform, REST APIs, NoSQL
Soft skills
troubleshooting, interpersonal skills, verbal communication, writing skills, Agile, Scrum, collaboration, problem-solving, commitment, digital data collection