
Data Engineer – Mainframe to Cloud Modernization
Ocient
Full-time
Location Type: Remote
Location: India
About the role
- Develop, enhance, and troubleshoot Mainframe batch processes using JCL, Easytrieve, and SAS.
- Build and maintain automation and data processing scripts using Python.
- Support distributed data processing workloads using Apache Spark and the PySpark API.
- Write efficient SQL queries for data extraction, analysis, and transformation.
- Work with Google Cloud Platform (GCP) services, primarily Cloud Storage, for data movement and storage management.
- Collaborate with data analysts, engineers, and business teams to support data initiatives and enhance data workflows.
- Contribute to documentation, code reviews, and best practices that uphold data and code quality.
- Investigate data issues, perform root-cause analysis, and implement corrective actions.
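The SQL-scripting duties above (extraction, analysis, transformation from Python) can be sketched with a minimal, self-contained example. Here `sqlite3` stands in for whatever database the team actually uses, and the `claims` table and its columns are invented purely for illustration:

```python
import sqlite3

# Hypothetical staging table standing in for an extracted dataset;
# table and column names are illustrative, not from this posting.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("C1", "NORTH", 120.0), ("C2", "NORTH", 80.0), ("C3", "SOUTH", 50.0)],
)

# An extraction/aggregation query of the kind the role describes:
# total claim amount per region.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM claims GROUP BY region ORDER BY region"
).fetchall()

totals = dict(rows)  # {"NORTH": 200.0, "SOUTH": 50.0}
conn.close()
```

The same query shape (filter, group, aggregate) carries over directly to a production SQL engine; only the connection setup changes.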
Requirements
- 1–2 years of experience in Mainframe technologies:
- JCL, Easytrieve, SAS
- 1–2 years of experience with:
- Python for scripting and automation
- Apache Spark with familiarity in PySpark
- SQL for data manipulation and querying
- Basic working knowledge of GCP, especially:
- Cloud Storage (bucket operations, file upload/download, permissions)
- General GCP console navigation and IAM basics
- Strong analytical thinking, debugging ability, and problem-solving mindset.
- Good communication skills and ability to work effectively in collaborative environments.
Benefits
- Flexible work arrangements
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
JCL, Easytrieve, SAS, Python, Apache Spark, PySpark, SQL
Soft Skills
analytical thinking, debugging, problem-solving, communication, collaboration