
IT Cloud Data Engineer – HVR, DBT
Advocate Aurora Health
Full-time
Location Type: Remote
Location: Remote • Illinois • 🇺🇸 United States
Salary
💰 $40 - $60 per hour
Job Level
Mid-Level • Senior
Tech Stack
Cloud • ETL • Python • SQL
About the role
- Drive scope definition, requirements analysis, data and technical design, pipeline build, product configuration, unit testing, and production deployment
- Design scalable ingestion processes that bring on-prem, API-driven, third-party, and end-user-generated data sources into a common cloud infrastructure
- Design reusable assets, components, standards, frameworks, and processes to accelerate and facilitate data integration projects
- Develop data integration and transformation jobs using Python, SQL, and ETL/ELT tools (a minimal sketch follows this list)
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Build processes supporting data transformation, data structures, metadata, dependency and workload management
- Develop and implement scripts for data process maintenance, monitoring, and performance tuning
- Test and document data processes through data validation and verification procedures
- Ensure delivered solutions meet technical, functional, and non-functional requirements
- Provide technical guidance and mentorship to junior engineers, ensuring best practices in data engineering
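As an illustration of the Python/SQL ELT work described above, the sketch below loads records from a hypothetical API-driven source into a staging table and then builds a summary table with SQL. It is a minimal, illustrative example only: the sample data, table names, and the fetch_source_records helper are assumptions, and sqlite3 stands in for a real cloud warehouse connection.

```python
# Minimal ELT sketch (illustrative only): extract records from a hypothetical
# source, load them into a staging table, and build a curated summary table.
# All table names, columns, and the fetch_source_records helper are placeholders;
# sqlite3 stands in for a cloud warehouse connection.
import sqlite3
from datetime import datetime, timezone

def fetch_source_records() -> list[dict]:
    """Hypothetical extract step: pretend this calls a third-party API."""
    return [
        {"patient_id": 1, "visit_date": "2024-05-01", "charge": 120.0},
        {"patient_id": 2, "visit_date": "2024-05-02", "charge": 85.5},
    ]

def load_and_transform(conn: sqlite3.Connection) -> None:
    """Load raw records into a staging table, then aggregate into a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_visits "
        "(patient_id INTEGER, visit_date TEXT, charge REAL, loaded_at TEXT)"
    )
    loaded_at = datetime.now(timezone.utc).isoformat()
    conn.executemany(
        "INSERT INTO stg_visits VALUES (?, ?, ?, ?)",
        [(r["patient_id"], r["visit_date"], r["charge"], loaded_at)
         for r in fetch_source_records()],
    )
    # Transformation step: aggregate staged rows into a daily summary table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS visit_daily_summary AS "
        "SELECT visit_date, COUNT(*) AS visit_count, SUM(charge) AS total_charge "
        "FROM stg_visits GROUP BY visit_date"
    )
    conn.commit()

if __name__ == "__main__":
    with sqlite3.connect(":memory:") as conn:
        load_and_transform(conn)
        for row in conn.execute("SELECT * FROM visit_daily_summary"):
            print(row)
```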
Requirements
- Bachelor's Degree in Computer Science or related field
- Typically requires 5 years of experience in at least two IT disciplines, such as database management, cloud engineering, data engineering, and middleware technologies
- Includes 2 years of work experience with cloud platforms, including experience with data integration, performance optimization, and platform administration
- Experience defining, designing, and developing solutions with data integration platforms/tools
- Proven experience building and optimizing data pipelines, and data sets
- Advanced working knowledge of SQL, including query authoring, with experience in relational databases and working familiarity with a variety of database technologies (a brief validation query sketch follows this list)
- Must have experience in data transformation and data pipeline development using GUI-based tools or programming languages such as SQL and Python
- Proficiency in Python and SQL for scripting and building data transformation processes is preferred
- Must have experience with DevOps tool chains and processes
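As a small illustration of the SQL query authoring and data validation experience called out above, the sketch below runs basic sanity checks against a loaded table. It is illustrative only; the table and column names are hypothetical, and sqlite3 again stands in for a real warehouse connection.

```python
# Minimal data validation sketch (illustrative only): verify that a load
# produced rows and no NULL keys. Table and column names are placeholders.
import sqlite3

def validate_no_null_keys(conn: sqlite3.Connection, table: str, key: str) -> bool:
    """Return True if the table has at least one row and no NULL key values."""
    null_count, = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {key} IS NULL"
    ).fetchone()
    row_count, = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    return null_count == 0 and row_count > 0

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stg_visits (patient_id INTEGER, charge REAL)")
    conn.execute("INSERT INTO stg_visits VALUES (1, 120.0), (2, 85.5)")
    print(validate_no_null_keys(conn, "stg_visits", "patient_id"))  # True
    conn.close()
```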
Benefits
- Paid Time Off programs
- Health and welfare benefits such as medical, dental, vision, life, and Short- and Long-Term Disability
- Flexible Spending Accounts for eligible health care and dependent care expenses
- Family benefits such as adoption assistance and paid parental leave
- Defined contribution retirement plans with employer match and other financial wellness programs
- Educational Assistance Program
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, SQL, ETL, ELT, data integration, data transformation, data pipeline development, data validation, performance tuning, database management
Soft skills
technical guidance, mentorship, best practices
Certifications
Bachelor's Degree in Computer Science