Tech Stack
Amazon Redshift, AWS, ETL, Python, Scala, Spark, SQL
About the role
- Design and develop enterprise-wide big data solutions using Spark, Scala, AWS Glue, Lambda, SNS/SQS, and CloudWatch.
- Develop applications in Scala/Python and implement ETL/ELT processes and frameworks.
- Create integration and application technical design documentation.
- Conduct peer reviews of functional design documentation and perform code reviews.
- Complete development, configuration, test cases, and unit testing.
- Resolve complex defects during testing phases and identify root causes.
- Support and execute performance testing; provide production support, troubleshooting, and tuning.
- Ensure best practices are followed during all phases of the project.
- Communicate issues, incidents, and updates to management and stakeholders in a timely manner.
- Work hybrid, collaborating with team members and leading where required.
Requirements
- 6+ years of relevant experience in the design, development, and end-to-end delivery of enterprise-wide big data solutions.
- Experience designing and developing big data solutions using Spark, Scala, AWS Glue, Lambda, SNS/SQS, and CloudWatch is a must.
- Strong application development experience in Scala/Python.
- Strong database and SQL experience, preferably with Redshift.
- Experience with Snowflake is an added advantage.
- Experience with ETL/ELT processes and frameworks is a must.
- Familiarity with Git repositories and CI/CD deployment processes.
- Production support experience, including troubleshooting and tuning production environments.
- Effective communication with management and stakeholders about issues, incidents, and updates.
- Able to work independently and collaboratively, taking the lead where required.
- Experience creating technical design documentation and conducting peer reviews.
- Experience in development, configuration, unit testing, code reviews, and performance testing.