Partners with Regions Technology teams to design, build, and maintain the data structures and systems that support Data and Analytics and Data Product use cases
Builds data pipelines to collect and arrange data and manage data storage in Regions’ big data environments
Builds robust, testable programs for moving, transforming, and loading data using cloud-based Big Data tools such as AWS Glue, Step Functions, Snowflake, Kafka, and SageMaker
Ensures data is prepared, arranged, and ready for each defined business use case
Provides consultation to all areas of the organization that plan to use data to make decisions
Supports team members in developing information delivery solutions and aids in automating data products
Requirements
Bachelor’s degree and four (4) years of experience in a quantitative/analytical/STEM field or a related technical field
Or Master’s degree and two (2) years of experience in a quantitative/analytical/STEM field
Or Ph.D. in a quantitative/analytical/STEM field
One (1) year of working experience programming in Python, Scala, SQL, and Terraform
One (1) year of working experience with cloud-based Big Data technology such as Amazon Elastic MapReduce (EMR), AWS Glue, BigQuery, or Snowflake
Benefits
Paid Vacation/Sick Time
401K with Company Match
Medical, Dental, and Vision Benefits
Disability Benefits
Health Savings Account
Flexible Spending Account
Life Insurance
Parental Leave
Employee Assistance Program
Associate Volunteer Program
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data pipelines, data storage, data transformation, programming, Python, Scala, SQL, Terraform, cloud-based Big Data, data analytics