Salary
💰 $98,124 - $166,810 per year
Tech Stack
Apache · AWS · Cloud · Java · Jenkins · Postgres · Python · Spark
About the role
- Migrate data from SAS to Python and modernize data processing stacks
- Design and build software processing pipelines using tools and frameworks in the AWS ecosystem
- Analyze requirements and architecture specifications to create detailed design documents
- Own full-cycle software engineering functions (design, implement, test, deploy)
- Work with large scale data sets and prepare big data for analysts and data scientists
- Use SAS, Python, and AWS technologies to build data processing pipelines
- Work with DevOps engineers on continuous integration (CI), continuous delivery (CD), and Infrastructure as Code (IaC) processes; translate specifications into code and design documents
- Perform code reviews and develop processes for improving code quality
- Be proactive about scalability, performance, and availability of systems
- Deploy developed solutions in AWS environment and examine results for accuracy
- Teach others Spark, inform design decisions, and debug runtime problems
- Collaborate across multiple project teams to deliver integrated healthcare reporting solutions for CMS
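The SAS-to-Python migration at the heart of this role typically means re-expressing SAS DATA steps and PROCs as DataFrame transformations. A minimal, hypothetical sketch (invented column names and values, pandas standing in for the Spark/Databricks stack):

```python
import io

import pandas as pd

# Hypothetical claims data, standing in for a SAS dataset (.sas7bdat).
raw = io.StringIO(
    "claim_id,state,amount\n"
    "1,MD,120.50\n"
    "2,VA,89.25\n"
    "3,MD,310.00\n"
)
claims = pd.read_csv(raw)

# Rough Python equivalent of a SAS by-group summary:
#   PROC MEANS DATA=claims SUM; CLASS state; VAR amount; RUN;
summary = claims.groupby("state", as_index=False)["amount"].sum()
print(summary)
```

In a production pipeline the same group-by logic would usually run in Spark (e.g. `df.groupBy("state").sum("amount")`) against Parquet data in S3 rather than in-memory pandas, but the translation pattern is the same.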
Requirements
- Bachelor’s degree required (degree in Computer Science or related field preferred)
- 5+ years of high-volume software engineering experience
- 2+ years of experience working in Python
- 2+ years of experience migrating code to a cloud environment
- 2+ years of experience with Agile methodology
- Candidate must be able to obtain and maintain a Public Trust Clearance
- Candidate must reside in the U.S., be authorized to work in the U.S., and all work must be performed in the U.S.
- Candidate must have lived in the U.S. for three (3) full years out of the last five (5) years
- U.S. Citizenship or Green Card, due to federal contract requirements (preferred)
- SAS experience (strongly preferred)
- Master's degree and 5+ years of technical experience (preferred)
- Experience working in the healthcare industry with PHI/PII (preferred)
- Federal Government contracting work experience (preferred)
- Expertise working as part of a dynamic, interactive Agile team (preferred)
- Strong written and verbal communication skills (preferred)
- Prior experience working remotely full-time (preferred)
- Experience with Apache Parquet, Apache Spark, AWS Glue, AWS Athena, Databricks (preferred)
- Familiarity with Python, SAS, PostgreSQL, Jenkins, Java, Git and GitHub, Confluence
- Experience with DevOps practices: CI, CD, and Infrastructure as Code
- Experience deploying solutions in AWS environments