Salary
💰 $70 - $80 per hour
Tech Stack
AWS, Cloud, DynamoDB, Java, Jenkins, Kafka, Python, SDLC, Terraform
About the role
- Develop event-driven Kafka services in Java, with a focus on containers running in AWS
- Develop cloud-native code intended for the AWS environment
- Create automated unit and integration tests, with an eye toward quality and 80%+ code coverage
- Perform code reviews for peers on the team
- Work with users and QA to test applications and resolve all reported defects and issues
- Work in an Agile development environment, accurately estimating story points and meeting sprint deadlines
- Improve DevOps processes to maximize CI/CD efficiency; adjust Terraform code as needed for deployments
- Convey complex technical concepts to non-technical staff in terms everyone can understand
- Support the next-generation archival and search solution for wealth and asset managers
- Join the BRCC Business Services engineering team supporting a cloud-based archival solution and related services
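The core responsibility above, event-driven Kafka services in Java, typically takes the shape of a poll-process loop. A minimal sketch of that shape, where a `BlockingQueue` stands in for the Kafka topic (a real service would use `KafkaConsumer` from the kafka-clients library, which needs an external dependency; the event names here are invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class Main {
    // Stand-in for a Kafka topic; in the real service this would be a
    // KafkaConsumer<String, String> subscribed to the topic.
    static BlockingQueue<String> topic = new LinkedBlockingQueue<>();

    // Poll-process loop: pull events until the batch limit or a quiet timeout.
    static List<String> drain(int maxEvents) {
        List<String> processed = new ArrayList<>();
        try {
            while (processed.size() < maxEvents) {
                // Analogous to consumer.poll(Duration) in kafka-clients.
                String event = topic.poll(100, TimeUnit.MILLISECONDS);
                if (event == null) break;           // no events within the timeout
                processed.add(event.toUpperCase()); // placeholder business logic
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();     // preserve interrupt status
        }
        return processed;
    }

    public static void main(String[] args) {
        topic.add("account-opened");
        topic.add("trade-settled");
        System.out.println(drain(10));
    }
}
```

In production the processing step would hand off to the archival pipeline, and offset commits replace the simple batch limit.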
Requirements
- Strong command of the Java language
- 5+ years AWS experience
- 5+ years working with Kafka or other streaming / event platforms
- 1+ years working with Infrastructure-as-Code tools such as Terraform or CloudFormation
- Prior work on one or more projects involving a full SDLC implementation
- Ability to work with Git
- Demonstrated ability to work well in a team environment
- 1+ years using Python
- Experience with Lambda, DynamoDB, CloudWatch, Glue, SQS, Cognito, IAM, and API Gateway
- Ability to document APIs using Swagger (or similar solution)
- Understanding of IAM roles and the principle of least privilege
- Experience with automated CI/CD pipelines using Jenkins
- Any experience with AI is a plus
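The IAM and Terraform requirements above go together in practice. A hedged sketch of what a least-privilege setup can look like in Terraform, granting a Lambda function only the DynamoDB read actions it uses; all names (`archival_reader`, `ArchivalIndex`) are invented for illustration, not taken from this posting:

```hcl
# Execution role that only the Lambda service can assume.
resource "aws_iam_role" "archival_reader" {
  name = "archival-reader"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}

# Inline policy scoped to the exact actions and the single table the code needs.
resource "aws_iam_role_policy" "read_only" {
  name = "dynamodb-read-only"
  role = aws_iam_role.archival_reader.id
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect   = "Allow"
      Action   = ["dynamodb:GetItem", "dynamodb:Query"]
      Resource = "arn:aws:dynamodb:*:*:table/ArchivalIndex"
    }]
  })
}
```

Scoping `Action` and `Resource` this narrowly, rather than granting `dynamodb:*` on `*`, is the concrete form of least privilege the requirement refers to.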