Salary
💰 $114,900 - $172,300 per year
Tech Stack
Azure, Cloud, ERP, ETL, Flask, JavaScript, Node.js, PySpark, Python, SQL
About the role
Ecolab is looking for a Sr Data Engineer based in St. Paul, MN or Naperville, IL to join the Analytics and Digital Solutions Team within Ecolab Digital in support of Global Supply Chain.
As a technical lead, you'll drive continuous improvements in our digital capabilities and advanced analytics. In this role, you will:
- Lead the development of new digital products, providing critical insights to solve business challenges.
- Enhance data utilization across the organization through improved processes, governance, and data management.
- Build a strong data foundation and adopt modern data architecture for current and future analytical needs.
- Lead data initiatives: Serve as a liaison among stakeholders to analyze and define data requirements for reporting and business process changes.
- Manage data infrastructure: Proactively manage Snowflake and SQL databases and analytical data models.
- Drive data excellence: Develop, test, and tune semantic models for enterprise reporting, ensuring compliance with IT security requirements.
- Advance data architecture: Lead the adoption of modern data architecture and identify opportunities to solve business problems with state-of-the-art solutions.
- Perform reverse engineering: Analyze and understand existing complex data structures and processes to facilitate migrations, integrations, or improvements.
- Foster best practices: Mentor peers on implementing and improving our data management and governance framework across technologies like Microsoft Power BI, Snowflake, Microsoft Azure, and on-premises data sources.
- Promote Agile methodologies: Champion and follow Scrum/Agile frameworks.
Requirements
- Bachelor’s degree in Mathematics, Statistics, Computer Science, Information Technology, or Engineering
- Immigration sponsorship is not available for this position.
- 5 years in a data engineering, analytics, or business intelligence role.
- Experience with supply chain applications, especially in the procurement domain.
- Strong experience with cloud data warehouses, specifically Snowflake (streams, tasks) and the Azure platform (SQL Server, Logic Apps, App Services, Data Factory, and Power BI, including pipelines, Lakehouse, and Warehouse).
- Proficiency in ETL data engineering, dimensional data modeling, master data management, data governance, and end-to-end data lineage documentation.
- Advanced SQL skills (cursors, triggers, CTEs, procedures, functions, external tables, dynamic tables, security roles) and Python skills (object-oriented programming, handling JSON/XML).
- Experience building ETL or ELT data pipelines using Snowflake streams, tasks, and stored procedures, and Fivetran/dbt tools.
- Experience with the medallion data architecture framework.
- Expertise in analytical application development using Python, Streamlit, Flask, Node.js, Graph API, and Power Apps.
- Experience deploying applications to Azure App Services.
- 2 years of Agile/Scrum project management experience, including data requirements gathering and project ownership.