Tech Stack
AWS, Azure, Cloud, ERP, ETL, Google Cloud Platform, Python, SQL, Tableau
About the role
- Design, build, and maintain scalable ETL/ELT pipelines to support BI and analytics use cases.
- Collaborate with BI analysts and business stakeholders to understand data needs and translate them into technical requirements.
- Develop and manage data models and data marts that enable efficient reporting and analysis.
- Ensure data quality, integrity, and governance across all BI datasets.
- Optimize data workflows for performance, scalability, and cost-efficiency in cloud and/or on-prem environments.
- Integrate data from multiple sources including internal systems (ERP, CRM, operational databases) and third-party APIs.
- Monitor and troubleshoot data pipelines and workflows to ensure timely data availability.
- Implement and maintain metadata management, data lineage, and documentation for transparency and compliance.
- Support the BI team in developing dashboards, reports, and self-service analytics tools.
Requirements
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
- 3+ years of experience in data engineering, preferably supporting BI or analytics functions.
- Proficiency in SQL and experience with other programming languages (e.g., Python).
- Hands-on experience with ETL tools and data integration patterns.
- Experience with data warehousing platforms.
- Familiarity with BI tools such as Tableau and Power BI.
- Experience working in cloud environments (AWS, GCP, or Azure).
- Preferred: Experience with CI/CD pipelines for data workflows.
- Preferred: Familiarity with data governance frameworks and privacy regulations (e.g., GDPR, CCPA).
- Preferred: Understanding of agile methodologies and experience working in cross-functional teams.
- Strong problem-solving skills and the ability to communicate technical concepts to non-technical audiences.