Salary
💰 $138,700 - $203,900 per year
Tech Stack
Airflow, Ansible, AWS, Chef, Cloud, Distributed Systems, DNS, Docker, Go, Java, Kafka, Kubernetes, Linux, Puppet, Python, Scala, Spark, TCP/IP, Terraform
About the role
- Design, build, and maintain infrastructure and scalable frameworks to support data ingestion, processing, and analysis.
- Collaborate with stakeholders, analysts, and product teams to understand business requirements and translate them into technical solutions.
- Architect and implement data solutions using modern data technologies such as Kafka, Spark, Hive, Hudi, Presto, and Airflow, as well as cloud-based services like AWS Lake Formation, Glue, and Athena.
- Design and implement frameworks and solutions for performance, reliability, and cost-efficiency.
- Ensure data quality, integrity, and security throughout the data lifecycle.
- Stay current with emerging technologies and best practices in the big data ecosystem.
- Mentor early-career engineers and contribute to a culture of continuous learning and improvement.
Requirements
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in software development or a related field.
- Strong proficiency in programming languages such as Python, Java, or Scala.
- Strong experience with building frameworks for big data technologies such as Spark, Kafka, Hive, and distributed computing systems.
- Experience working with AWS technologies at scale.
- Solid understanding of software engineering principles, including object-oriented and functional programming paradigms, design patterns, and code quality practices.
- Excellent problem-solving and analytical skills.
- Strong verbal and written communication skills, with the ability to work effectively in a cross-functional team environment.