Salary
💰 $90,000 - $123,000 per year
Tech Stack
Azure, Cloud, ERP, ETL, Matillion, Python, SOAP, SQL
About the role
Essential Job Functions and Responsibilities
- Requirements Gathering: Collaborate effectively with cross-functional teams, including business analysts and application owners, to gather and document detailed integration requirements and deliver solutions. Translate business needs into clear, detailed, and actionable technical specifications for integration development.
- Solution Design and Development: Design and create data integration solutions based on business requirements and technical specifications. Build and maintain ETL/ELT pipelines using Matillion and Fivetran to ensure the quality and transformation of data within Snowflake for analytical purposes. Develop, implement, and maintain real-time and batch integration workflows and APIs using Jitterbit or equivalent platforms.
- Cross-System Integration: Collaborate with Salesforce developers and administrators to build and maintain integrations with other enterprise systems. Work with the ERP (JDE) team to understand data structures and build effective integrations using appropriate methods. Use HVR to configure and monitor real-time data replication and change data capture (CDC) processes.
- Quality and Documentation: Develop and execute comprehensive test plans for integration solutions to ensure data accuracy, performance, and reliability. Participate in code reviews and ensure adherence to integration standards and best practices. Create and maintain a comprehensive integration documentation library, including data dictionaries, pipeline flowcharts, and SLA definitions, ensuring audit-readiness and long-term maintainability.
- Perform other duties as assigned within the scope of responsibilities and requirements of the job. Perform essential job functions and duties with or without reasonable accommodation.
- Continuous Improvement: Proactively identify opportunities to optimize existing data pipelines and integration processes for greater efficiency, data quality, and business value. Stay abreast of advancements in integration technologies, data management trends, and emerging AI capabilities relevant to data integration. Contribute to the development of data engineering and integration standards and guidelines, promoting best practices and innovation. Support efforts to evaluate new tools, including AI/ML, to enhance data integration capabilities and drive data-driven insights. Assist in assessing emerging technologies that can enhance data integration, automation, or AI-driven insights in alignment with evolving business needs and industry best practices.
Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field preferred; equivalent work experience required in its absence
- 3-5 years of experience in data engineering, with a strong focus on building and maintaining data pipelines and implementing enterprise-wide data integration and automation solutions
- 3-5 years of hands-on experience in Snowflake Data Warehouse, Matillion ETL, Power BI, HVR, Jitterbit, SQL, and other scripting/development tools (e.g., Python)
- Experience with Agile or other project management methodologies, preferably in a manufacturing environment
- Azure knowledge and experience with Azure DevOps
- Knowledge of DevOps practices, including building and maintaining CI/CD pipelines using tools like Azure DevOps for deploying ETL and data integration workflows
- Experience with AI/ML platforms, JDE data structures, Salesforce APIs, API design, ETL, and data integration
- Travel requirements: 0% - 10%