Salary
💰 $101,475 - $152,213 per year
Tech Stack
AWS, Azure, Cloud, ETL, Google Cloud Platform
About the role
- Define and maintain the product roadmap for data engineering initiatives in collaboration with Data Platform Leads and stakeholders
- Translate business requirements into technical specifications and user stories
- Prioritize backlog items based on business value, technical feasibility, and strategic alignment
- Understand and articulate the data journey — from source systems to data pipelines, storage, transformation, and consumption
- Collaborate with engineering teams to design scalable, secure, and efficient data solutions
- Ensure data quality, lineage, and governance are embedded in all deliverables
- Lead sprint planning, backlog grooming, and retrospectives using Agile methodologies
- Manage JIRA boards, epics, and stories to ensure timely and quality delivery
- Coordinate with the Release Manager to plan and communicate release schedules and deployment timelines
- Serve as the primary point of contact for data engineering initiatives across business units
- Communicate progress, risks, and dependencies clearly and regularly
- Facilitate cross-functional collaboration to align on priorities and resolve blockers
- Maintain comprehensive documentation of data flows, architecture decisions, and product features
- Create and share release notes, deployment plans, and user guides
- Ensure stakeholders are informed and engaged throughout the product lifecycle
Requirements
- Bachelor's or Master's degree in software engineering, data engineering, or another engineering discipline
- 3+ years of experience in data engineering, analytics, or technical product ownership
- Strong understanding of data platforms, ETL/ELT pipelines, cloud technologies (e.g., AWS, Azure, GCP), and data governance
- Hands-on experience with Agile tools like JIRA, Confluence, and version control systems
- Excellent communication, collaboration, and stakeholder management skills
- Experience working with cross-functional teams in a fast-paced environment
- Preferred Skills: Familiarity with modern data stack tools (e.g., Databricks, dbt, Snowflake)