R1 RCM

Senior Data Platform Scala Architect, Multiple Positions

Full-time

Location Type: Remote

Location: Remote • Utah • 🇺🇸 United States

Salary

💰 $158,787 - $202,867 per year

Job Level

Senior

Tech Stack

Airflow • Apache • Azure • Cloud • ETL • NoSQL • Scala • Spark • SQL

About the role

  • Create data models and distributed system data architecture using Scala to support healthcare revenue management.
  • Apply subject-matter expertise in healthcare revenue cycle data from electronic medical record (EMR) systems to monitor and optimize data architecture performance.
  • Identify and resolve bottlenecks to ensure effective data processing and retrieval.
  • Implement data architecture strategy for a centralized data lake solution needed to facilitate reporting and analytics across all applications within the company.
  • Act as a liaison between technology and business teams by translating complex data requirements into actionable architectural designs that support business goals.
  • Develop and maintain comprehensive data models that enable new GenAI initiatives focused on healthcare revenue cycle billing data.
  • Oversee automation of manual data pipeline operations using AI, migrate large-scale data lakes, and support data transformation.
  • Design and oversee Extract, Transform, Load (ETL) solutions that consolidate data from different sources, ensuring data consistency, quality, and accessibility (see the Scala/Spark sketch after this list).
  • Implement proof of concept data pipelines and tools to drive execution.
  • Maintain comprehensive documentation of data architecture design and processes.
  • Provide training to team members and stakeholders on best practices related to data management.
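
As a rough illustration of the ETL consolidation work described above, here is a minimal Scala/Spark sketch. The storage paths, column names, and cleansing rules are hypothetical assumptions, not details taken from this posting.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

// Minimal sketch of an ETL consolidation job: read claims from two hypothetical
// sources, normalize them to one schema, and publish them to a curated
// data-lake path. Paths, columns, and rules are illustrative assumptions.
object ClaimsConsolidationJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("claims-consolidation")
      .getOrCreate()

    // Source 1: daily claim extracts landed as CSV (hypothetical path and schema).
    val csvClaims = spark.read
      .option("header", "true")
      .csv("abfss://landing@examplelake.dfs.core.windows.net/emr_a/claims/")
      .select(
        col("claim_id"),
        col("patient_id"),
        to_date(col("service_date"), "yyyy-MM-dd").as("service_date"),
        col("billed_amount").cast("decimal(12,2)").as("billed_amount")
      )

    // Source 2: claims already stored as Parquet by another upstream system.
    val parquetClaims = spark.read
      .parquet("abfss://landing@examplelake.dfs.core.windows.net/emr_b/claims/")
      .selectExpr(
        "claim_id",
        "patient_id",
        "cast(service_date as date) as service_date",
        "cast(billed_amount as decimal(12,2)) as billed_amount"
      )

    // Consolidate both sources, drop duplicate claims, and keep only rows
    // passing basic consistency checks before writing to the curated zone.
    val consolidated: DataFrame = csvClaims
      .unionByName(parquetClaims)
      .dropDuplicates("claim_id")
      .filter(col("billed_amount") >= 0 && col("service_date").isNotNull)

    consolidated.write
      .mode("overwrite")
      .partitionBy("service_date")
      .parquet("abfss://curated@examplelake.dfs.core.windows.net/claims/")

    spark.stop()
  }
}
```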

Requirements

  • Must have a Bachelor’s degree or foreign equivalent in Computer Science, Information Management, Information Technology, Computer Information Science, or a related field, and 8 years of post-bachelor’s, progressive related work experience; OR a Master’s degree or foreign equivalent in Computer Science, Information Management, Information Technology, Computer Information Science, or a related field, and 5 years of related work experience.
  • Of the required experience, must have 3 years of experience with the following:
    ◦ Scala and Spark development for data transformation;
    ◦ Using the Apache Airflow orchestration tool to design and create production-ready ETL pipeline jobs;
    ◦ Designing data models with both SQL and NoSQL databases;
    ◦ Designing distributed system data architecture using Scala;
    ◦ Assuring data quality (a minimal Spark sketch follows the requirements list);
    ◦ Migrating a data lake that serves enterprise customers and hosts multi-petabyte data of various types, including files and images;
    ◦ Collaborating with business stakeholders, engineers, and product managers to understand data architecture requirements;
    ◦ Mentoring junior staff responsible for delivering critical data architecture features.
  • Of the required experience, must have 1 year of experience with the following:
    ◦ Serving as a subject matter expert in healthcare revenue cycle data from electronic medical record systems, including Epic or Cerner;
    ◦ Developing data platform services using Azure-native resources in an Azure cloud environment;
    ◦ Building data lineage and data validation tools that users can leverage across hundreds of data pipelines while aligning with industry standards and meeting compliance requirements;
    ◦ Automating manual data pipeline operations by leveraging AI, including large language models.
  • Travel to various locations throughout the US required up to 5% of the time.
  • Telecommuting permitted up to 5 days a week, depending on business need.
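
To illustrate the "assuring data quality" item referenced above, here is a minimal Spark/Scala sketch of rule-based row checks; the rule names, columns, and thresholds are hypothetical examples, not requirements from the posting.

```scala
import org.apache.spark.sql.{Column, DataFrame}
import org.apache.spark.sql.functions._

// Minimal data-quality sketch: evaluate a few named rules against a claims
// DataFrame and fail the run if any rule exceeds its allowed violation ratio.
// Rule names, columns, and thresholds are hypothetical examples.
object ClaimsQualityChecks {

  // A rule pairs a name with a predicate that flags BAD rows and a tolerance.
  final case class Rule(name: String, violation: Column, maxRatio: Double)

  def run(claims: DataFrame): Unit = {
    val rules = Seq(
      Rule("missing_claim_id", col("claim_id").isNull, maxRatio = 0.0),
      Rule("negative_billed_amount", col("billed_amount") < 0, maxRatio = 0.0),
      Rule("future_service_date", col("service_date") > current_date(), maxRatio = 0.001)
    )

    val total = claims.count().max(1L) // avoid division by zero on empty input

    val failures = rules.flatMap { rule =>
      val bad = claims.filter(rule.violation).count()
      val ratio = bad.toDouble / total
      if (ratio > rule.maxRatio) Some(f"${rule.name}: $bad rows ($ratio%.4f)") else None
    }

    if (failures.nonEmpty)
      throw new IllegalStateException("Data quality checks failed:\n" + failures.mkString("\n"))
  }
}
```

In a production setting, checks of this kind would typically run as a validation step in the Airflow-orchestrated pipeline before curated data is published.
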
Benefits

  • Medical
  • Dental
  • Vision
  • 401k matching
  • Paid time off [amount depends on years of service]
  • Paid parental leave
  • 8 paid holidays per year
  • Disability coverage
  • Tuition reimbursement
  • Health savings account
  • Flexible spending account
  • Wellness benefits
  • Life insurance
  • Accidental death and dismemberment insurance

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Scala, Spark, ETL, data modeling, SQL, NoSQL, data architecture, data quality, data transformation, data pipeline automation
Soft skills
collaboration, mentoring, communication, problem-solving, training