Tech Stack
AWS, Distributed Systems, Docker, ETL, Java, Kubernetes, Microservices, Python, Scala, Spark, Spring, Spring Boot, Terraform
About the role
- Design and develop scalable, distributed, and resilient software systems.
- Build, test, and optimize Spark-based ETL pipelines in Scala or Python (see the sketch after this list).
- Collaborate with researchers and data scientists to deliver production-ready data and models.
- Participate in architecture discussions, peer reviews, and deployment processes.
- Define and monitor technical and operational metrics to ensure system health and performance.
- Continuously identify areas for optimization and efficiency improvements across services.
- Mentor junior engineers and contribute to engineering best practices.
- Drive innovation by staying current with emerging tools and technologies.
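For illustration, here is a minimal Scala sketch of the kind of Spark ETL pipeline this role involves. It is only an assumption-laden example: the S3 paths, schema, and column names (user_id, event_time) are hypothetical and not taken from this team's actual systems.

// Illustrative sketch only: paths, schema, and column names are hypothetical.
import org.apache.spark.sql.{SparkSession, functions => F}

object EventsEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("events-etl")
      .getOrCreate()

    // Extract: read raw JSON events from a hypothetical landing zone.
    val raw = spark.read.json("s3://example-bucket/raw/events/")

    // Transform: drop malformed rows, derive the event date, aggregate per user and day.
    val daily = raw
      .filter(F.col("user_id").isNotNull)
      .withColumn("event_date", F.to_date(F.col("event_time")))
      .groupBy("user_id", "event_date")
      .agg(F.count("*").as("event_count"))

    // Load: write partitioned Parquet for downstream consumers.
    daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/daily_event_counts/")

    spark.stop()
  }
}

Partitioning the output by date is one common layout choice for downstream query performance; the real pipeline's structure would depend on the team's data model and Databricks workflow setup.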
Requirements
- 5+ years of software engineering experience
- Strong programming skills in Scala, Python, or Java
- Hands-on experience building and optimizing ETL jobs with Spark
- Experience with Databricks (DBX) – workflows, data development, and debugging
- Solid understanding of microservices architecture and distributed systems
- Experience with AWS, Docker, and Kubernetes
- Familiarity with Spring Boot, Terraform, and infrastructure-as-code
- Strong analytical and problem-solving skills
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Scala, Python, Java, Spark, ETL, microservices architecture, distributed systems, AWS, Docker, Kubernetes
Soft skills
analytical skills, problem-solving skills, mentoring, collaboration, innovation