Cloud Data Integration Architect

Advocate Aurora Health

full-time

Location

🇺🇸 United States

Salary

💰 $62 - $93 per hour

Job Level

Senior, Lead

Tech Stack

Airflow, Cloud, Python, SQL

About the role

  • Develop solutions architecture and evaluate architectural alternatives.
  • Define and articulate an integration strategy, architecture, design patterns, and standards.
  • Drive scope definition, requirements analysis, data and technical design, pipeline build, product configuration, unit testing, and production deployment.
  • Introduce new integration solutions from design to rollout.
  • Design scalable ingestion processes to bring on-prem, API-driven, third-party, and end-user-generated data sources into a common cloud infrastructure.
  • Design reusable assets, components, standards, frameworks, and processes to accelerate and facilitate data integration projects.
  • Provide technical leadership and guidance to development teams, ensuring best practices in data integration.
  • Diagnose and resolve issues, provide workarounds or escalate to service owners.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Ensure delivered solutions meet functional and non-functional requirements and committed timeframes.
  • Maintain overall industry knowledge on latest trends and technologies.

Requirements

  • Bachelor's Degree in Computer Science or related field.
  • Typically requires 7 years of experience in at least two IT disciplines, including technical architecture, application development, middleware, database management or operations.
  • Capable of communicating and presenting complex information to technical and nontechnical stakeholders.
  • Experience in developing and producing integration and technical architecture documents, including requirements, conceptual technical models, and platform, performance, and operational requirements.
  • Experience defining, designing, and developing solutions with cloud data integration platforms/tools.
  • Proven experience building and optimizing data pipelines and data sets.
  • Advanced SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of database technologies.
  • Hands-on experience with modern cloud-based ELT tools and technologies such as Fivetran, HVR, dbt, and Airflow.
  • Proficiency in Python and SQL for scripting and building data transformation processes is preferred.
  • Experience in test automation with a focus on testing integrations, including APIs and data flows between enterprise systems.
  • Must have experience with end-to-end implementation of cloud data warehouse solutions.
  • Must have experience with DevOps tool chains and processes.
  • Must have hands-on experience with Snowflake Data Cloud.
  • Must have experience with Agile development methodologies.
  • Experience with healthcare data systems and data models is a plus.