OU Health

Data Engineer III

Full-time

Location Type: Remote

Location: United States

About the role

  • Lead and serve as the subject matter expert in the design, development and delivery of data pipelines and value-added data assets across the OU Health data ecosystem
  • Design, build, and maintain data pipelines that deliver curated, value-added data assets such as data marts and other purpose-built data stores
  • Ensure data pipelines are optimized, highly reliable, and contain low technical debt
  • Design, build, and maintain the tools and infrastructure needed to handle large datasets
  • Enforce data governance policies including data quality, validation, lineage, metadata management, and adherence to healthcare regulations
  • Develop and implement comprehensive data quality frameworks, addressing issues such as data accuracy, completeness, and consistency
  • Work closely with different application and operational teams to understand business needs and align data engineering initiatives accordingly
  • Guide, mentor, quality-review, and train the Data Engineering team and ETS department on technical skills and best practices

Requirements

  • Bachelor's Degree required
  • 5 or more years in analytics (Business Intelligence, Data Engineering, Data Science, etc.) required
  • Epic certification/accreditation required within 6 months of hire or within 3 months of class completion
  • Expert level analytic skills related to working with structured and unstructured datasets
  • Guide, mentor, and train the Data Engineering team, Data Scientists, and Business Intelligence Developers on technical skills and best practices
  • Critical thinking and creative problem-solving skills, along with the ability to communicate well with stakeholders throughout the organization
  • Effective communication, project management and organizational skills
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • Working knowledge of stream processing and highly scalable data stores
  • Previous experience manipulating, processing, and extracting value from large, disconnected datasets
  • Expert level SQL and data manipulation skills
  • Exposure to big data tools: dbt, SnowPark, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Snowflake, MS SQL Server, and PostgreSQL
  • Experience with integration tools: Fivetran, Matillion, SSIS, dbt, SnowSQL
  • Exposure to stream-processing systems: IBM Streams, Flume, Storm, Spark-Streaming, etc.
  • Exposure to consuming and building APIs
  • Exposure to object-oriented/object function programming languages: Python, Java, C++, Scala, etc.
  • Experience with statistical data analysis tools: R, SAS, SPSS, etc.
  • Experience with visual analytics tools: QlikView, Tableau, Power BI etc.
  • Familiarity with Agile development methodology
  • Familiarity with electronic health records and financial systems, such as Epic Systems, Workday, and Strata
  • Ability to work independently and within teams
  • Ability to develop and advise on data asset use to provide solutions to organizational needs

Benefits

  • PTO
  • 401(k)
  • Medical and dental plans
  • Comprehensive benefits package

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data pipelines, data quality frameworks, SQL, data manipulation, stream processing, big data tools, relational databases, NoSQL databases, statistical data analysis, visual analytics

Soft Skills
critical thinking, creative problem-solving, effective communication, project management, organizational skills, mentoring, training, collaboration, alignment with business needs, independent work

Certifications
Bachelor's Degree, Epic certification/accreditation