
Senior Software Engineer – Interop
Abacus Insights
Full-time
Location Type: Remote
Location: Remote • 🇺🇸 United States
Job Level
Senior
Tech Stack
Airflow • AWS • Azure • Cloud • ETL • Go • Google Cloud Platform • Java • PySpark • Python • Spark • SQL
About the role
- Design and implement high-performance cloud solutions that conform to US healthcare security standards, leveraging broad experience across platforms such as AWS, Azure, and Databricks, informed by analytical work with end users, product managers, and software/data architects
- Build data processing pipelines leveraging AWS/Azure, Airbyte, and Databricks
- Build pipelines and API ecosystems around FHIR data stores (e.g., Firely, AWS HealthLake), applying familiarity with FHIR systems to ingest and read data via FHIR-based APIs
- Write PySpark, Python, and SQL code to meet requirements for clients or internal teams
- Deploy code using CI/CD frameworks
- Critically analyze and review peer-authored designs and code
- Apply exceptional problem-solving skills, anticipating and resolving issues before they affect business productivity
- Troubleshoot client-reported incidents: identify root causes, fix and document problems, and implement preventive measures
- Optimize the performance and cost of data processing workflows
- Demonstrate deep working knowledge of Airflow, ETL (Extract, Transform, Load) processes, APIs, and data connectors, and troubleshoot issues related to each
- Drive the technical excellence of a team, mentor other team members and lead by example
- Identify areas for technical investment, work with stakeholders to prioritize them on the roadmap, and lead efforts to implement them
Requirements
- Bachelor's degree, preferably in Computer Science, Computer Engineering, or related IT discipline
- 5+ years of commercial software development experience
- 3+ years of building or using cloud services in a production environment (AWS, Azure, GCP, etc.)
- 2+ years of experience working with the FHIR standard and FHIR databases
- 2+ years of building ETL data pipelines at scale with Spark/PySpark and Databricks
- Strong programming skills (Python, Java, or other object-oriented languages)
- Go-getter with self-starter mindset
- Someone who stays current with emerging technologies and development techniques
- Excellent oral and written communication skills; strong analytical, problem-solving, organization, and prioritization skills
Benefits
- Equal Opportunity Employer: committed to building diverse teams and providing equal employment opportunities to all applicants, with comprehensive policies and practices to ensure this
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, PySpark, SQL, ETL, Airflow, AWS, Azure, Databricks, FHIR, API
Soft skills
problem-solving, analytical skills, communication skills, organization skills, prioritization skills, mentoring, leadership, critical analysis, self-starter mindset, adaptability
Certifications
Bachelor's degree in Computer Science, Bachelor's degree in Computer Engineering, related IT discipline