Salary
💰 $94,000 - $156,200 per year
Tech Stack
Apache, Cloud, Google Cloud Platform, Hadoop, Oracle, Python
About the role
- Contribute to the development of scalable systems on platforms like Apache Hadoop, Google Cloud Platform (GCP), and SAP HANA
- Participate in technical discussions and assist in code reviews
- Learn and apply engineering best practices under the guidance of senior team members
- Write maintainable and efficient code to support secure and highly available systems
- Assist in defining, developing, integrating, testing, documenting, and supporting technical solutions
- Support the development of high-performance data pipelines handling large-scale workloads
- Collaborate with the data team to analyze datasets, contribute to model building, and assist in creating reports and visualizations using Google Cloud tools
- Gain hands-on experience deploying large-scale data solutions within an enterprise environment
- Follow established project management frameworks and adhere to development processes under supervision
- Support the design, development, troubleshooting, and debugging of software programs for databases, applications, tools, and networks
Requirements
- Minimum of 2 years of relevant work experience
- Bachelor's degree or equivalent experience
- Experience with enterprise-grade database technologies and large-scale data platforms (preferred)
- Experience designing mission-critical data infrastructure at petabyte scale (preferred)
- Experience with GCP cloud-native databases such as Bigtable and Spanner, cloud networking, and architecture (preferred)
- Experience with managing Oracle databases (preferred)
- Proficiency in automation and scripting languages such as Python and Bash (preferred)
- Demonstrated leadership in cross-functional collaboration and high-impact delivery (preferred)
- Experience with generative AI tools for database optimization (preferred)