DocPlanner

Product Data Engineer

Full-time

Location Type: Remote

Location: Italy


About the role

  • Design, build, and maintain reliable end-to-end ETL pipelines orchestrated with Apache Airflow
  • Integrate data from multiple sources (internal operational databases, third-party APIs, SaaS tools) into the Google Cloud Data Warehouse (BigQuery)
  • Design and evolve data models, warehouse schemas, and transformations to support scalable analytics and KPIs
  • Ensure data quality, reliability, and observability through monitoring, validation, and alerting
  • Own the product data structure, mapping product features and behaviors to analytics-ready data models
  • Define and maintain meaningful KPIs in collaboration with Product and BI
  • Enable analytics for AI-powered product features, ensuring visibility on usage, performance, quality, and business impact
  • Partner with Product, BI, and other stakeholders to gather requirements and deliver dashboards and reports
  • Maintain clear and up-to-date documentation for data models, pipelines, and metrics
  • Act as the primary bridge between Backend Engineering and BI, owning the flow from data production to analytics consumption
  • Triage, analyze, and address BI requests related to data availability, correctness, performance, and modeling
  • Collaborate with Backend Engineers on data contracts, schema evolution, and performance optimization, without owning core backend services
  • Proactively identify and resolve data-related issues impacting BI and Product teams
  • Own first-level monitoring and support for data pipelines and Airflow DAGs, ensuring timely resolution of failures
  • Collaborate with BI and Backend teams to troubleshoot and resolve complex issues
  • Continuously improve the stability, performance, and maintainability of the data platform
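The responsibilities above revolve around ETL pipelines with data-quality checks. As a rough illustration only (this posting specifies no implementation; all field names and records below are hypothetical, and the load step stands in for a warehouse insert such as BigQuery), an extract-transform-load step might look like:

```python
import json
from datetime import date

def extract(raw: str) -> list[dict]:
    """Parse raw JSON exported from a hypothetical operational source."""
    return json.loads(raw)

def transform(records: list[dict]) -> list[dict]:
    """Normalize records into an analytics-ready shape with a basic quality check."""
    out = []
    for r in records:
        if r.get("user_id") is None:  # data-quality gate: drop rows missing a key field
            continue
        out.append({
            "user_id": r["user_id"],
            "event": r.get("event", "unknown"),
            "event_date": r.get("date", date.today().isoformat()),
        })
    return out

def load(rows: list[dict]) -> int:
    """Stand-in for a warehouse load (e.g. a BigQuery insert); returns rows loaded."""
    return len(rows)

raw = '[{"user_id": 1, "event": "signup", "date": "2024-05-01"}, {"event": "orphan"}]'
rows = transform(extract(raw))
loaded = load(rows)  # the malformed second record is filtered out
```

In practice each of these steps would typically be an Airflow task in a scheduled DAG, with monitoring and alerting wired to the validation failures.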

Requirements

  • 2+ years of experience in Data Engineering or a similar role
  • Hands-on experience designing, scheduling, and maintaining ETL pipelines using Apache Airflow
  • Strong SQL skills and solid understanding of data warehousing concepts (preferably Google BigQuery)
  • Proficiency in Python for ETL development and automation
  • Experience working in AI product environments, supporting the data needs of AI features such as experimentation, monitoring, and analytics
  • Experience integrating data from multiple sources (APIs, databases, flat files, external platforms)
  • Experience building dashboards or analytical views using BI tools (preferably Looker)
  • Familiarity with Google Cloud Platform (GCP) services
  • Strong analytical and problem-solving skills
  • Comfortable working in a cross-functional, ambiguous environment
  • Strong communication skills and ability to collaborate with both technical and non-technical stakeholders
  • Strong interest in product data and how data drives product decisions

Benefits

  • 100% remote work, with the option to join our offices in Bologna or Barcelona
  • One extra day off for your birthday
  • Access to iFeel – our mental wellbeing platform
  • €8/day meal vouchers – lunch is covered if you're in the Bologna office
  • Private health coverage via Metasalute
  • Access to the “Study in Action” platform for continuous learning and professional development
  • Comprehensive private health insurance with Adeslas (Spain specific)
  • Flexoh – flexible compensation platform (Spain specific)
  • Wellhub – gym & wellness network membership (Spain specific)
  • Language courses (Spain specific)

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
ETL pipelines, Apache Airflow, SQL, data warehousing, Google BigQuery, Python, data integration, dashboard development, BI tools, data modeling
Soft Skills
analytical skills, problem-solving skills, communication skills, collaboration, cross-functional teamwork, adaptability