Salary
💰 $88,150 - $157,850 per year
Tech Stack
Airflow, Apache, AWS, Azure, Cloud, ETL, Google Cloud Platform, Kafka, Spark
About the role
- Support the development and execution of the data ingestion and movement platform vision, aligned with business goals and customer needs
- Help create and maintain a clear, prioritized roadmap for data ingestion and movement capabilities
- Evangelize the Data Ingestion and Movement platform across the organization and drive stakeholder alignment
- Stay abreast of industry trends and competitive landscape to inform data ingestion strategy (Apache Kafka, Apache Airflow, AWS Glue, Azure Data Factory, Google Cloud Dataflow, etc.)
- Support requirement gathering and product strategy for data ingestion, ETL/ELT pipelines, and data movement services
- Understand end-to-end data ingestion workflows and how data movement fits into the broader data ecosystem and downstream analytics
- Support data governance initiatives for data lineage, quality, and compliance
- Ensure data ingestion and movement processes adhere to regulatory, compliance, and data quality standards
- Partner with engineering on the development of data ingestion tools, pipeline orchestration services, and data movement capabilities
- Help define product capabilities for pipeline monitoring, error handling, and data quality validation
- Support customer roadshows and training on data ingestion and movement capabilities
- Build instrumentation and observability into data ingestion and movement tools to enable data-driven product decisions and pipeline monitoring
- Work closely with engineering, data engineering, and data teams to ensure seamless delivery of data ingestion and movement products
- Partner with customer success, support, and engineering teams to create clear feedback loops
- Translate technical capabilities into business value and user benefits, and support alignment across multiple stakeholders and teams in complex environments
Requirements
- Understanding of data ingestion patterns, ETL/ELT processes, and data pipeline architectures (Apache Kafka, Apache Airflow, Apache Spark, AWS Glue, etc.)
- Experience with data integration APIs, connectors, and data pipeline orchestration tools
- Basic understanding of data pipeline monitoring, observability, and data quality validation practices
- Experience in cloud data ecosystems (AWS, GCP, Azure)
- Proven analytical and problem-solving abilities with a data-driven approach to decision-making
- Experience working with Agile methodologies and tools (JIRA, Azure DevOps)
- Strong communication, stakeholder management, and cross-functional collaboration skills
- Strong organizational skills with the ability to manage product backlogs
- 5+ years of technical product management experience building platforms that support data ingestion, ETL/ELT pipelines, data engineering, and data infrastructure
- Track record of delivering successful products in fast-paced environments and supporting complex, multi-stakeholder initiatives
- Proven ability to work with technical teams and translate business requirements into technical product specifications
- Experience with customer research, user interviews, and data-driven decision making
- Bachelor's degree in computer science, engineering, management information systems, or related technical field required
- MBA/MS or equivalent experience preferred
- At this time, GEICO will not sponsor a new applicant for employment authorization for this position.