thyssenkrupp

Data Engineer

full-time

Location Type: Remote

Location: Missouri, United States

About the role

  • Develop, maintain, and monitor data ingestion and enrichment ETL/ELT pipelines within the platform that load and transform raw data into data products.
  • Partner with regional and/or global IT infrastructure teams to support and configure data platform storage and compute layers.
  • Maintain and build CI/CD pipeline code and automated test plans to ensure automated deployment between development and production environments.
  • Manage data platform-related ITSM ticketing processes (incident & change requests).
  • Collaborate with data team members, architects, data stewards, data owners, and business SMEs to develop data product business requirements for data cleansing and enrichment.
  • Implement data security and data governance policies within the data platform to protect sensitive information and maintain data quality.
  • Partner with cybersecurity and compliance teams to rigorously ensure compliance with applicable data security, data protection, and regulatory requirements.
  • Partner with global data teams to ensure that the local data platform & products maintain interoperability.
  • Continuously identify and drive opportunities to improve platform performance, reduce complexity and technical debt, and lower cloud compute and storage costs.
  • Maintain support documentation within the team repository.
  • Escalate data platform issues to management and appropriate partners.
  • Leverage agile frameworks and Azure DevOps to execute the team backlog.
  • Implement data platform metadata management standards and policies.

Requirements

  • Bachelor’s degree, or equivalent work experience.
  • Minimum of 1 year of experience in the IT industry and 1 year of experience as a Data Engineer, or equivalent work experience.
  • At least 1 year of data engineering experience building ETL/ELT pipelines using Airflow, Fivetran, Qlik, DBT, ADF, Snowpipe, Matillion, and/or similar tools to load and transform structured and semi-structured data.
  • At least 1 year of experience building and supporting cloud-native databases, data warehouses, data lakes, lakehouses, and/or data repositories.
  • At least 1 year of experience writing Python and/or SQL to manipulate, transform, and load large, disparate data sets.
  • Advanced working SQL knowledge and prior experience loading data into relational databases, data warehouses, and/or data lakes on Microsoft Azure or another cloud platform.
  • Ability to concisely and completely document business requirements into user stories within Azure DevOps and create related technical documentation.
  • Basic working knowledge of implementing identity and access management (IAM) and data object, column, and row level access controls to protect sensitive information.
  • Fundamental understanding of implementing DevOps, code management, versioning/branching, and automated testing concepts.
  • Experience with leveraging IDEs (e.g., Visual Studio Code and/or Snowflake Notebooks).
  • Fundamental understanding of how to leverage elastic compute and interoperable storage to scale data pipelines and processing up and out.
  • Strong project management and organizational skills.
  • Experience with performing root cause analysis to answer specific business questions, resolve issues, & identify improvement opportunities.
  • Experience designing and implementing relational, star schema, and snowflake data models.
  • Excellent verbal and written communication skills; ability to provide meaningful, regular status communications to various levels of leadership.
  • Proven track record of driving continuous personal improvement by learning and adapting quickly to the latest technologies and trends.
  • Familiarity with SAP supply chain data and/or the aerospace industry is preferred but not required.

Benefits

  • Medical, Dental, Vision Insurance
  • Life Insurance and Disability
  • Voluntary Wellness Programs
  • 401(k) and RRSP programs with Company Match
  • Paid Vacation and Holidays
  • Tuition Reimbursement
  • And more! Benefits may vary based on job, country, union role, and/or company segment.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
ETL, ELT, data pipelines, Python, SQL, data modeling, DevOps, data governance, metadata management, cloud databases
Soft Skills
project management, organizational skills, communication skills, collaboration, problem solving, documentation, continuous improvement, agile methodologies, root cause analysis, business requirements analysis