
Data Engineer II – ABS Data Analytics, Finance Reporting
GM Financial
Full-time
Location Type: Hybrid
Location: Fort Worth • Texas • 🇺🇸 United States
Job Level
Junior, Mid-Level
Tech Stack
AWS, Azure, Cassandra, Cloud, Distributed Systems, ETL, Google Cloud Platform, Gradle, Hadoop, HDFS, Kafka, Kubernetes, Maven, MongoDB, NoSQL, Oracle, Python, Redis, Spark, SQL, Subversion, Terraform, Webpack
About the role
- The ABS Data Engineer II is a critical technical role within the GMF North America Securitization and Conduit Reporting team.
- This position supports the ABS reporting team in building and maintaining reliable and scalable data pipelines.
- This position will leverage your expertise in Python, SQL, and Azure cloud technologies to extract, transform, and load data efficiently, enabling seamless data access and analysis for accounting business users (a minimal sketch follows this list).
- This position involves a high level of coordination with other departments and third-party software vendors.
- Work with internal business partners to identify, capture, collect, and format data from external sources, internal systems, and the data warehouse to extract features of interest.
- Contribute to the evaluation, research, experimentation efforts with batch and streaming data engineering technologies in a lab to keep pace with industry innovation.
- Work with data engineering related groups to inform on and showcase capabilities of emerging technologies and to enable the adoption of these new technologies and associated techniques.
- Coordinate with Privacy Compliance to ensure proper data collection and handling.
- Create and implement business rules and functional enhancements for data schemas and processes.
- Perform data load monitoring and resolution.
- Work with internal business clients to resolve data availability and activation issues.
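Illustration: a minimal, hypothetical sketch of the kind of Python/SQL ETL step described above. The file path, connection URL, and table/column names are assumptions for the example only, not part of this posting.

```python
# Minimal, hypothetical ETL sketch: extract a monthly CSV, apply a simple
# transformation, and load it into a reporting table. The file path,
# connection URL, and table/column names are illustrative placeholders.
import pandas as pd
from sqlalchemy import create_engine


def run_monthly_load(csv_path: str, connection_url: str) -> int:
    # Extract: read the raw monthly extract into a DataFrame
    raw = pd.read_csv(csv_path, parse_dates=["report_date"])

    # Transform: normalize column names and derive a simple feature of interest
    raw.columns = [c.strip().lower() for c in raw.columns]
    raw["delinquent"] = raw["days_past_due"] > 30

    # Load: append to the reporting table used by accounting business users
    engine = create_engine(connection_url)  # e.g. an Azure SQL connection string
    raw.to_sql("abs_monthly_reporting", engine, if_exists="append", index=False)
    return len(raw)


if __name__ == "__main__":
    rows = run_monthly_load("monthly_extract.csv", "sqlite:///demo.db")
    print(f"Loaded {rows} rows")
```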
Requirements
- Experience with processing large data sets using Hadoop, HDFS, Spark, Kafka, Flume or similar distributed systems
- Experience ingesting data from various source formats and systems, such as JSON, Parquet, SequenceFile, message queues (MQ), cloud databases, and relational databases such as Oracle (a brief PySpark sketch follows this list)
- Experience with cloud technologies (such as Azure, AWS, GCP) and native toolsets such as Azure ARM Templates, HashiCorp Terraform, and AWS CloudFormation
- Understanding of cloud computing technologies, business drivers and emerging computing trends
- Thorough understanding of hybrid cloud computing: virtualization technologies; Infrastructure as a Service, Platform as a Service, and Software as a Service delivery models; and the current competitive landscape
- Working knowledge of object storage technologies, including but not limited to Azure Data Lake Storage (ADLS) Gen2, S3, MinIO, and Ceph
- Experience with containerization, including but not limited to Docker, Kubernetes, Spark on Kubernetes, and the Spark Operator
- Working knowledge of Agile development / SAFe Scrum and Application Lifecycle Management
- Strong background with source control management systems (Git or Subversion); Build Systems (Maven, Gradle, Webpack); Code Quality (Sonar); Artifact Repository Managers (Artifactory); Continuous Integration/Continuous Deployment (Azure DevOps)
- Experience with NoSQL data stores such as CosmosDB, MongoDB, Cassandra, Redis, Riak or other technologies that embed NoSQL with search such as MarkLogic or Lily Enterprise
- Experience creating and maintaining ETL processes
- Knowledge of best practices in information technology governance and privacy compliance
- Experience with Adobe solutions (ideally Adobe Experience Platform, DTM/Launch) and REST APIs
- Ability to troubleshoot complex problems and work across teams to meet commitments
- Excellent computer skills and proficiency in digital data collection
- Ability to work in an Agile/Scrum team environment
- Strong interpersonal, verbal, and writing skills
- Familiarity with digital technology solutions (DMPs, CDPs, Tag Management Platforms, Cross-Device Tracking, SDKs, etc.)
- Knowledge of Real-Time CDP and Journey Analytics solutions
- Understanding of big data platforms and architectures, data stream processing pipelines/platforms, data lakes, and data lakehouses
- SQL experience: querying data and communicating the insights that can be derived
- Understanding of cloud solutions such as Google Cloud Platform, Microsoft Azure, and Amazon Web Services (AWS) cloud architecture and services
- Understanding of GDPR, privacy & security topics
- 2-4 years of hands-on experience with data engineering required
- Bachelor’s Degree in related field or equivalent experience required
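Illustration: a rough, hypothetical PySpark sketch of the distributed processing and multi-format ingestion called out above. The paths, column names, and output layout are assumptions for the example only.

```python
# Hypothetical PySpark sketch: ingest JSON and Parquet source extracts, join them,
# and write a partitioned Parquet output. Paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("abs-ingest-example").getOrCreate()

# Ingest two of the source formats mentioned in the requirements
payments = spark.read.json("/data/raw/payments/*.json")
accounts = spark.read.parquet("/data/raw/accounts/")

# Simple transformation: monthly payment totals per account
monthly = (
    payments
    .withColumn("report_month", F.date_trunc("month", F.col("payment_date")))
    .groupBy("account_id", "report_month")
    .agg(F.sum("amount").alias("total_paid"))
)

# Enrich with account attributes and persist for downstream reporting
result = monthly.join(accounts, on="account_id", how="left")
result.write.mode("overwrite").partitionBy("report_month").parquet("/data/curated/abs_monthly/")

spark.stop()
```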
Benefits
- Generous benefits package available on day one, including: 401(k) matching
- bonding leave for new parents (12 weeks, 100% paid)
- tuition assistance
- training
- GM employee auto discount
- community service pay
- nine company holidays.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, SQL, Hadoop, Spark, Kafka, Azure, ETL, NoSQL, Containerization, Agile
Soft skills
interpersonal skills, verbal communication, writing skills, problem solving, coordination, collaboration, data analysis, monitoring, troubleshooting, commitment