Northwest Permanente

Data Engineer Consultant – Hybrid

Full-time

Location Type: Hybrid

Location: Portland, Oregon, United States

About the role

  • Develops data warehouse solutions through ingesting, integrating, and curating data, building database solutions, and administering systems to deliver enriched data to business intelligence analysts and data scientists
  • Utilizes Spark SQL and Python in Databricks to build and optimize scalable data pipelines
  • Assists in facilitating, analyzing, designing, and executing architecture solutions and in ensuring those solutions are adopted; assists in creating documentation and communicating the integration approach for components of the solution
  • Applies knowledge of ETL design, development, and processes, including building and orchestrating Databricks workflows and using Spark SQL and Python for data transformation
  • Takes accountability for ensuring specific interfaces, methods, parameters, procedures, and functions support technical solutions and are aligned with architectural designs
  • Assists in identifying and designing solutions that enable root cause analysis for proactive issue resolution and data quality maintenance; assists in building processes and diagnostic tools/measures into the development process so that data pipelines can be monitored and issues detected proactively
  • Gathers scope and business requirements for data infrastructure projects
  • Assists in translating business requirements and functional specifications into physical program designs, code modules, and stable, reliable data solutions by partnering with analysts and other team members to understand business needs
  • Assists in efforts to create data dictionaries to document data lineages, data definitions, transformations, and metadata for data infrastructure projects; identifies and reconciles inconsistencies in data definitions; takes steps to assure metadata accuracy and validity
  • Documents programming changes, design and system modifications, and their associated maintenance
  • Assists in executing strategies and plans for data security, backup, recovery, business continuity, and archiving
  • Identifies and develops opportunities for data reuse, migration, or retirement
  • Reviews and verifies resource estimates for technical design, coding, and testing efforts
  • Practices self-leadership and promotes learning in others by soliciting and acting on performance feedback; builds collaborative, cross-functional relationships.
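In practice, the pipeline duties above would be carried out in Databricks with Spark SQL and Python. As a rough illustration of the extract–transform–load pattern the role describes, here is a minimal plain-Python stand-in (not Databricks code; the table, columns, and sample data are hypothetical), including a simple data-quality check of the kind a pipeline would monitor:

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in Databricks this would be a Delta table or ingested file.
RAW_CSV = """member_id,visit_date,charge
1001,2024-01-05,250.00
1002,2024-01-06,
1001,2024-01-07,125.50
"""

def extract(text: str) -> list[dict]:
    """Ingest: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[dict]:
    """Curate: type the fields and drop rows that fail a quality rule (missing charge)."""
    return [
        {"member_id": int(r["member_id"]),
         "visit_date": r["visit_date"],
         "charge": float(r["charge"])}
        for r in rows
        if r["charge"]  # proactive data-quality check, per the monitoring duties above
    ]

def load(rows: list[dict], conn: sqlite3.Connection) -> tuple:
    """Load curated rows into a warehouse table for downstream analysts."""
    conn.execute("CREATE TABLE IF NOT EXISTS visits (member_id, visit_date, charge)")
    conn.executemany(
        "INSERT INTO visits VALUES (:member_id, :visit_date, :charge)", rows
    )
    return conn.execute("SELECT COUNT(*), SUM(charge) FROM visits").fetchone()

conn = sqlite3.connect(":memory:")
count, total = load(transform(extract(RAW_CSV)), conn)
print(count, total)  # → 2 375.5 (the row with a missing charge was dropped)
```

In a real Databricks workflow, `extract`, `transform`, and `load` would typically be notebook tasks orchestrated as a Job, with the quality rule expressed as a Spark SQL filter or Delta constraint rather than a Python conditional.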

Requirements

  • Bachelor’s degree in Computer Information Systems, Computer Science, or related field
  • Three (3) years of hands-on Databricks experience (Notebooks, Jobs, Workflows) in ETL design, implementation, and maintenance using Python, Spark SQL, Delta Lake, GitHub, SFTP, and TCP/IP
  • Four (4) years of experience in IT with a broad range of exposure to all aspects of business planning, systems analysis, application development, and/or data warehouse development using SDLC methodologies
  • Four (4) years of data analytics experience
  • Two (2) years of hands-on experience in architecture, data modeling, and implementation of enterprise data solutions
  • Strong passion for data analytics and fostering a data-driven culture
  • Familiarity with legal, security, and regulatory issues associated with managing and exposing sensitive institutional data
  • Experience working with relational databases, data extraction and manipulation language, and in architecture principles and techniques across master data, transaction data and derived/analytic data
  • Experience in the use of data warehouse ETL methodologies and tools

Benefits
  • 15% employer contribution to retirement programs, including pension
  • 90% employer-paid health plan
  • Tuition Reimbursement
  • Child Care Benefits
  • Flexible Work Schedules
  • Paid Parental Leave
  • Self-Care Days + Paid Time Off

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Python, Spark SQL, ETL design, Databricks, data modeling, data analytics, data transformation, data warehousing, SDLC methodologies, data extraction
Soft Skills
self-leadership, collaboration, communication, problem-solving, analytical thinking, performance feedback, cross-functional relationships, accountability, proactive issue resolution, business requirements analysis