Tech Stack
AWS, Azure, Cloud, ETL, Google Cloud Platform, Kafka, MongoDB, Python, SQL
About the role
- Build and maintain **Kafka pipelines** for claims data ingestion and routing (see the sketch after this list).
- Develop ETL/ELT processes for integrating Amisys, Facets, ABS, and Excelys into Pisces.
- Implement schema validation and ensure data quality across multiple sources.
- Collaborate with BSAs and QA to deliver accurate edits and exclusions.
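
To make the first two responsibilities concrete, here is a minimal sketch of a Kafka consumer that validates incoming claim records against a required-field set and routes failures to a dead-letter topic. It uses the kafka-python client; the topic names, broker address, and field names are illustrative assumptions, not details from this posting.

```python
# Minimal sketch: claims ingestion with schema validation and routing.
# Topic names, broker address, and field names are hypothetical.
import json

from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

REQUIRED_FIELDS = {"claim_id", "member_id", "source_system", "amount"}

consumer = KafkaConsumer(
    "claims.raw",                         # hypothetical ingestion topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

for message in consumer:
    claim = message.value
    missing = REQUIRED_FIELDS - claim.keys()
    if missing:
        # Records that fail validation go to a dead-letter topic for review.
        producer.send("claims.deadletter",
                      {"claim": claim, "missing": sorted(missing)})
    else:
        producer.send("claims.validated", claim)
```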
Requirements
- Location: Coppell, TX or New York, NY (hybrid, with 2-3 days in the office per week)
- Proficiency in **Kafka, Python, SQL** for ETL and data validation (see the sketch after this list).
- Experience with **cloud-native data platforms** (AWS Glue, Azure Data Factory, GCP Dataflow).
- Familiarity with **MongoDB, Talend, or other integration tools**.
- Strong data modeling, schema design, and performance optimization knowledge.
- Ability to debug data pipeline issues in large-scale environments.
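
For the SQL-and-Python validation skill noted above, here is a minimal sketch of typical data-quality checks (duplicate keys, missing identifiers), using Python's built-in sqlite3 module as a stand-in database; the table and column names are hypothetical.

```python
# Minimal sketch: SQL-driven data-quality checks from Python.
# Table and column names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_id TEXT, member_id TEXT, amount REAL);
    INSERT INTO claims VALUES
        ('C1', 'M1', 120.50),
        ('C1', 'M1', 120.50),   -- duplicate row
        ('C2', NULL, 75.00);    -- missing member_id
""")

# Duplicate-key check: claim_id should be unique after ingestion.
dupes = conn.execute(
    "SELECT claim_id, COUNT(*) FROM claims "
    "GROUP BY claim_id HAVING COUNT(*) > 1"
).fetchall()

# Null check: every claim must carry a member_id.
nulls = conn.execute(
    "SELECT COUNT(*) FROM claims WHERE member_id IS NULL"
).fetchone()[0]

print(f"duplicate claim_ids: {dupes}")       # -> [('C1', 2)]
print(f"claims missing member_id: {nulls}")  # -> 1
```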
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Kafka, Python, SQL, ETL, ELT, data validation, data modeling, schema design, performance optimization, debugging