WernerCo.

Data Engineer

full-time

Location: 🇺🇸 United States • Illinois

Salary

💰 $90,000 - $123,000 per year

Job Level

Mid-Level / Senior

Tech Stack

Azure, Cloud, ERP, ETL, Matillion, Python, SOAP, SQL

About the role

  • ProDriven Brands is seeking a skilled and motivated Data Engineer with a passion for leveraging data, innovative technologies, and artificial intelligence (AI) to elevate our business processes.
  • This role supports efforts to understand business needs, gather detailed requirements, and translate them into robust, scalable integration and automation solutions across our enterprise systems.
  • The ideal candidate will possess strong analytical skills and have a comprehensive understanding of data integration principles and technologies.
  • This position also involves exploring opportunities to harness data insights and contribute to AI-driven enhancements.
  • This position is currently hybrid, with a minimum of one day per month in the office for collaboration, teamwork, and business needs.
  • Essential job functions and responsibilities include:
  • Business Requirements Gathering and Analysis: Collaborate effectively with cross-functional teams, including business analysts and application owners, to gather and document detailed integration requirements and deliver solutions.
  • Translate business needs into clear, detailed, and actionable technical specifications for integration development.
  • Solution Design and Development: Design and create data integration solutions based on business requirements and technical specifications.
  • Build and maintain ETL/ELT pipelines using Matillion and Fivetran to ensure the quality and transformation of data within Snowflake for analytical purposes.
  • Develop, implement, and maintain real-time and batch integration workflows and APIs using Jitterbit or equivalent platforms.
  • Collaborate with Salesforce developers and administrators to build and maintain integrations with other enterprise systems.
  • Work with the ERP (JDE) team to understand data structures and build effective integrations utilizing appropriate methods.
  • Utilize HVR for configuring and monitoring real-time data replication and change data capture (CDC) processes.
  • Develop and execute comprehensive test plans for integration solutions to ensure data accuracy, performance, and reliability.
  • Participate in code reviews and ensure adherence to integration standards and best practices.
  • Create and maintain a comprehensive integration documentation library, including data dictionaries, pipeline flowcharts, and SLA definitions, ensuring audit-readiness and long-term maintainability.
  • Perform other duties as assigned within the scope of the responsibilities and requirements of the job.
  • Perform essential job functions and duties with or without reasonable accommodation.
  • Continuous Improvement: Proactively identify opportunities to optimize existing data pipelines and integration processes for enhanced efficiency, data quality, and business value.
  • Stay abreast of advancements in integration technologies, data management trends, and emerging AI capabilities relevant to data integration.
  • Contribute to the development of data engineering and integration standards and guidelines, promoting best practices and innovation.
  • Support efforts to evaluate new tools, including AI/ML, to enhance our data integration capabilities and drive data-driven insights.
  • Assist in assessing emerging technologies that can enhance data integration, automation, or AI-driven insights in alignment with evolving business needs and industry best practices.
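To illustrate the kind of staging work the ETL/ELT responsibilities above describe, here is a minimal batch transform sketch in plain Python. The table and column names (`order_id`, `amount`, `order_date`) are hypothetical; in practice this logic would live in Matillion or Fivetran components and load into Snowflake rather than run as a standalone script.

```python
from datetime import date

def transform_orders(raw_rows):
    """Clean raw extract rows: drop incomplete records, normalize types."""
    cleaned = []
    for row in raw_rows:
        order_id = row.get("order_id")
        amount = row.get("amount")
        if order_id is None or amount is None:
            continue  # skip incomplete records rather than loading bad data
        cleaned.append({
            "order_id": int(order_id),            # enforce integer key
            "amount": round(float(amount), 2),    # normalize to 2-decimal currency
            # default missing dates to the load date (a common staging convention)
            "order_date": row.get("order_date") or date.today().isoformat(),
        })
    return cleaned

raw = [
    {"order_id": "101", "amount": "19.99", "order_date": "2024-05-01"},
    {"order_id": None, "amount": "5.00"},  # incomplete record, dropped
]
print(transform_orders(raw))
```

The same validate-then-typecast pattern applies whether the transform runs in a Matillion Python component or in Snowflake SQL; the key design choice is rejecting incomplete records before the warehouse load rather than after.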

Requirements

  • Bachelor's degree in Computer Science, Information Technology, or a related field preferred; equivalent work experience accepted in lieu of a degree
  • Preferred: certifications in relevant integration or cloud technologies
  • 3-5 years of experience in data engineering, with a strong focus on building and maintaining data pipelines and implementing enterprise-wide data integration and automation solutions
  • 3-5 years of hands-on experience in Snowflake Data Warehouse, Matillion ETL, Power BI, HVR, Jitterbit, SQL, and other scripting/development tools (e.g., Python)
  • Knowledge of Agile or other project management methodologies, preferably in a manufacturing environment
  • Azure knowledge and experience with Azure DevOps
  • Experience with other integration tools and technologies (e.g., MuleSoft) is a plus
  • Experience with AI/ML platforms
  • Experience working with JDE data structures and business processes, including implementing integrations using APIs or change data capture (CDC)
  • Understanding of Salesforce APIs (REST, SOAP, Bulk), data models, and integration patterns
  • Travel requirements: 0–10% domestic travel
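As a concrete example of the Salesforce REST API familiarity asked for above, the sketch below assembles a SOQL query request URL following Salesforce's documented `/services/data/vXX.X/query?q=<SOQL>` endpoint shape. The instance URL and object names are placeholders, and the API version is an assumption; a real integration would add OAuth 2.0 authorization headers and typically go through a client library or a Jitterbit connector.

```python
from urllib.parse import urlencode

API_VERSION = "v59.0"  # assumption: any currently supported version follows the same pattern

def build_soql_query_url(instance_url, soql):
    """Return the fully encoded REST query URL for a SOQL statement."""
    base = f"{instance_url}/services/data/{API_VERSION}/query"
    # urlencode handles percent/plus-encoding of spaces and quotes in the SOQL text
    return f"{base}?{urlencode({'q': soql})}"

url = build_soql_query_url(
    "https://example.my.salesforce.com",  # placeholder instance URL
    "SELECT Id, Name FROM Account WHERE BillingState = 'IL'",
)
print(url)
```

The same URL-building step underlies both one-off REST calls and bulk-query workflows; only the endpoint path and authentication differ between the REST, SOAP, and Bulk APIs the posting mentions.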