Cotiviti

Data Integration Engineer, Risk Adjustment

Full-time

Location: 🇺🇸 United States

Job Level

Mid-Level / Senior

Tech Stack

Airflow, AWS, Cloud, Docker, EC2, Kubernetes, Linux, Python, SQL, Terraform

About the role

  • Cotiviti is seeking a Data Integration Engineer to join our innovative software teams.
  • Responsible for onboarding customers to the Risk Adjustment workflow applications.
  • Work with platform engineering, product, and implementation teams.
  • Build scalable data pipelines that enable workflow applications, reporting, and customer confidence.
  • Requires strong, hands-on technical expertise in big data technologies and the ability to communicate with both technical and non-technical audiences.
  • Design and develop data flows and extraction processes.
  • Integrate client data into the Cotiviti Risk Adjustment workflow applications.
  • Work with relational databases and Python.
  • Implement open-source data quality standards.
  • Develop a new framework for data ELT jobs to scale implementation and monitoring (see the pipeline sketch after this list).
  • Build and scale automation that orchestrates complex workflows.
  • Support existing processes while leading efforts to redefine the data strategy.
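
The sketch below is a minimal, hypothetical illustration of the kind of ELT onboarding pipeline described above, written as an Airflow DAG (Airflow appears in the tech stack). The DAG id, task names, and placeholder callables are assumptions for illustration only, not Cotiviti's actual framework.

  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator


  def extract_client_file(**context):
      # Placeholder: pull the client's risk adjustment extract (e.g., from SFTP or S3).
      print("extracting client file")


  def load_to_staging(**context):
      # Placeholder: bulk-load the extract into a relational staging table.
      print("loading staging table")


  def validate_load(**context):
      # Placeholder: row counts and data quality checks before promotion.
      print("validating loaded data")


  with DAG(
      dag_id="client_onboarding_elt",   # hypothetical name
      start_date=datetime(2024, 1, 1),
      schedule="@daily",
      catchup=False,
  ) as dag:
      extract = PythonOperator(task_id="extract", python_callable=extract_client_file)
      load = PythonOperator(task_id="load", python_callable=load_to_staging)
      validate = PythonOperator(task_id="validate", python_callable=validate_load)

      extract >> load >> validate

Each client onboarding would typically get its own parameterized instance of a DAG like this, with scheduling, retries, and monitoring handled by the orchestrator rather than ad hoc cron jobs.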

Requirements

  • Bachelor’s degree (or foreign equivalent) in Computer Science, Computer Information Systems, Computer Engineering, or a related field.
  • Minimum 5 years of progressive, post-baccalaureate experience in product support within the healthcare industry, including:
  • 5 years of experience reading and writing relational database queries using T-SQL or ANSI-SQL.
  • 5 years of experience working with Linux platforms such as RedHat, CentOS, SUSE/SLES, Ubuntu, etc.
  • 5 years of strong analytical and troubleshooting experience.
  • 4 years of experience modifying and supporting code or scripting development using Bash/Python.
  • Experience managing and troubleshooting Linux servers in production.
  • Working knowledge of source control systems (Git preferred).
  • Familiarity with CI/CD pipelines and Git integration.
  • Ability to maintain Git repositories, branching strategies, and version control best practices.
  • Collaborate with cross-functional teams to deploy, configure, and troubleshoot applications in a Linux environment.
  • Implement and maintain automation scripts to improve efficiency and scalability.
  • Monitor system performance and proactively address issues to prevent downtime.
  • Plan and execute system upgrades, patches, and migrations.
  • Experience designing and developing data flows and extraction processes.
  • Experience integrating client data into workflow applications.
  • Experience with open-source data quality tools (e.g., Great Expectations); a minimal check of this kind is sketched after this list.
  • Experience developing scalable ELT pipelines using tools such as Argo Workflows, cron, Airflow, dbt.
  • Experience building reusable data methods and automation for complex workflows.
  • Experience supporting existing processes while contributing to data strategy improvements.
  • Plus: Experience with container deployment platforms and tools such as Kubernetes, Docker, Helm, and Terraform.
  • Plus: AWS Cloud experience with services like EC2, RDS, SQS, IAM, and S3.
  • Excellent spoken and written English communication skills.
  • Strong customer interaction and problem-solving abilities.
  • Customer-focused with a proactive and responsive approach.
  • Ability to diagnose customer-reported problems and recommend solutions within agreed SLAs.
  • Creative thinking to understand complex problems and communicate them to non-technical audiences.
  • Willingness to participate in on-call rotations for critical system issues.
  • Ability to document system configurations, operational procedures, and troubleshooting guides.
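
For the open-source data quality bullet above, here is a minimal sketch of the kind of check involved, written with pandas so it stays self-contained; the column names (member_id, dob) are hypothetical, and a framework such as Great Expectations would express similar assertions declaratively.

  import pandas as pd


  def check_member_extract(df: pd.DataFrame) -> list[str]:
      """Return a list of data quality failures found in a client member extract."""
      failures = []
      if df["member_id"].isna().any():
          failures.append("member_id contains nulls")
      if df["member_id"].duplicated().any():
          failures.append("member_id contains duplicates")
      today = pd.Timestamp.today().strftime("%Y-%m-%d")
      if not df["dob"].between("1900-01-01", today).all():
          failures.append("dob outside expected range")
      return failures


  if __name__ == "__main__":
      sample = pd.DataFrame({
          "member_id": ["A001", "A002", None],
          "dob": ["1980-05-01", "1975-11-23", "2101-01-01"],
      })
      print(check_member_extract(sample))  # flags the null member_id and the out-of-range dob

Checks like these would typically run as the validation step of the onboarding pipeline, so bad client files are caught before they reach the workflow applications.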