Lead GCP Data Engineer

Homeprotect Home Insurance

Full-time

Location Type: Hybrid

Location: New Malden, United Kingdom

About the role

  • Shape implementation choices and guide the team through trade-offs (cost, reliability, security, operability, maintainability).
  • Set and maintain engineering standards (patterns, templates, coding practices, testing expectations, documentation).
  • Create technical clarity through lightweight design notes and ADRs that help the team move faster with fewer surprises.
  • Lead end-to-end delivery of data products on GCP, from onboarding new sources through to curated datasets, marts, and business-facing outputs.
  • Own ingestion design and implementation at a platform level: selecting and refining patterns, establishing reusable templates, and ensuring ingestion is reliable, scalable, and easy to operate.
  • Build and improve transformation and modelling workflows in the warehouse, with a strong focus on correctness, performance, and cost control.
  • Work in an Agile way: contribute to planning, estimation, delivery cadence, and continuous improvement.
  • Maintain a strong focus on delivering measurable value to stakeholders, prioritising work that improves decision-making and accelerates insight delivery aligned with business needs.
  • Design pipelines and jobs to be production-grade: idempotent, observable, resilient to failure, and easy to support.
  • Champion automated testing and release practices so changes can be safely promoted through environments with minimal manual effort.
  • Mentor and upskill data engineers through pairing, structured reviews, knowledge sharing, and best-practice guidance.
  • Be a go-to technical reference for the team, providing calm, high-quality support when work is ambiguous or high-pressure.

Requirements

  • 5+ years’ experience as a Data Engineer, including building and operating production-grade data pipelines and data products.
  • 3+ years building on GCP in a production environment.
  • Strong ability to design and implement reliable data pipelines and make good engineering trade-offs.
  • Strong hands-on experience with BigQuery, Cloud Storage (GCS), Terraform, dbt, Kafka (and streaming data concepts), Dataflow, Cloud Run (jobs and/or services), and Cloud Composer (Airflow) or equivalent orchestration tooling.
  • Experience designing and operating pipelines across multiple ingestion patterns, including CDC, APIs, SFTP, and streaming.
  • Strong understanding of medallion architecture (Bronze, Silver, Gold) and how to implement it in a modern analytics platform.
  • Strong experience working within Agile methodologies (Scrum) and contributing positively to team rituals and delivery cadence.
  • Strong communication: you can explain technical decisions clearly to engineers and non-engineers, and you document just enough to scale the team.
  • GCP certification (Professional Data Engineer or similar).
  • Experience in regulated domains (insurance, finance).
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, data pipelines, data products, design patterns, data transformation, data modeling, automated testing, release practices, medallion architecture, Agile methodologies
Soft Skills
communication, mentoring, team collaboration, problem-solving, technical guidance, stakeholder management, decision making, continuous improvement, documentation, support under pressure
Certifications
GCP Professional Data Engineer