Salary
💰 $130,000–$150,000 per year
Tech Stack
Cloud, ETL, JavaScript, Python, SQL
About the role
- Own data pipelines (including claims, eligibility, HR, engagement, clinical, and more) connecting clients and vendors
- Translate vendor data layouts into Abett’s standard data models
- Configure pipelines, SFTP connections, mappings, and data quality validations
- Troubleshoot data issues and work directly with client/vendor technical teams to resolve discrepancies
- Contribute to reusable data models and quality frameworks that reduce custom work across clients
- Surface integration pain points and edge cases to inform Product and Engineering roadmaps
- Build and deploy data pipelines using SQL, Python, and cloud-native tools
- Map vendor data feeds to Abett’s standardized data models
- Implement automated checks for data quality, completeness, and timeliness
- Rapidly troubleshoot issues when they arise
- Develop pipeline standards and templates that move from one-off integrations toward standardized connectors
- Partner with Implementation and Client Services counterparts to support kickoff through go-live
- Work closely with customer and vendor technical counterparts to design integrations and resolve data quality issues
- Identify recurring integration patterns and inefficiencies, and propose improvements to productized workflows
- Maintain clear technical specifications and implementation notes for reuse and auditability
Requirements
- 4+ years’ experience in data engineering, data integration, or solutions engineering roles
- 3+ years’ direct experience working with healthcare, benefits, and/or health insurance data (eligibility, claims, EMR/EHR, Rx)
- Fluency in core healthcare data concepts, including coding terminology, claims management, and enrollment/eligibility relationships
- Strong proficiency in SQL
- Experience with Python for data transformation
- Experience with modern ETL/ELT tools and data platforms, such as dbt and Dagster
- Understanding of data exchange protocols (SFTP, APIs, HL7, EDI)
- Proven ability to troubleshoot and resolve complex data quality issues
- Strong communication skills and comfort interfacing with external vendors and clients
- Ability to work independently in ambiguous, evolving environments