Collaborate with Data Engineers to support the migration of data models from Oracle to BigQuery using DBT and ELT pipelines.
Develop, optimise, and maintain analytical data models following the Medallion architecture within BigQuery (an illustrative DBT model sketch follows this list).
Translate complex Oracle table structures into scalable GCP data models to support analytical use cases.
Work with ingestion pipelines that leverage Kafka (consumed in batches) and Dataflow to ensure reliable data availability in BigQuery (an illustrative upsert pattern follows this list).
Contribute to the build-out of the analytical data warehouse, ensuring data quality and governance standards are upheld.
Engage with prioritised migration plans to lift and shift tables and data models, targeting 60% migration completion within the year and full migration by mid-2025.
Participate in code freezes and deliverables aligned with project timelines, with onboarding into Rightmove targeted for late October/early November.
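To make the DBT and Medallion responsibilities concrete, here is a minimal sketch of an incremental silver-layer DBT model in BigQuery. All names (bronze_property_listings, listing_id, updated_at) are hypothetical placeholders, not Rightmove's actual schema.

    -- models/silver/silver_listings.sql (hypothetical model name)
    -- Silver-layer model: deduplicates the bronze feed, keeping the
    -- latest record per listing_id.
    {{
        config(
            materialized='incremental',
            unique_key='listing_id',
            partition_by={'field': 'updated_at', 'data_type': 'timestamp'}
        )
    }}

    select
        listing_id,
        property_type,
        price,
        updated_at
    from {{ ref('bronze_property_listings') }}
    where 1 = 1  -- keeps the incremental predicate below purely additive
    {% if is_incremental() %}
      -- on incremental runs, only process rows newer than what is loaded
      and updated_at > (select max(updated_at) from {{ this }})
    {% endif %}
    qualify row_number() over (
        partition by listing_id order by updated_at desc
    ) = 1

With materialized='incremental' and a unique_key, DBT's BigQuery adapter issues a MERGE under the hood, so re-running a batch is idempotent.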
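Because the Kafka feeds above are consumed in batches, a common ELT shape is to land each extract in a raw table and then upsert it into the silver layer with a MERGE. A minimal sketch, assuming hypothetical landing.kafka_listings_batch and analytics.silver_listings tables:

    -- Hypothetical upsert of a batch-landed Kafka extract into silver.
    merge `analytics.silver_listings` t
    using (
        -- keep only the latest event per listing_id within the batch
        select listing_id, price, status, event_ts
        from (
            select
                *,
                row_number() over (
                    partition by listing_id
                    order by event_ts desc
                ) as rn
            from `landing.kafka_listings_batch`
        )
        where rn = 1
    ) s
    on t.listing_id = s.listing_id
    when matched and s.event_ts > t.event_ts then
        update set price = s.price, status = s.status, event_ts = s.event_ts
    when not matched then
        insert (listing_id, price, status, event_ts)
        values (s.listing_id, s.price, s.status, s.event_ts)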
Requirements
Strong proficiency in SQL and data modelling best practices.
Hands-on experience with GCP BigQuery and cloud-based ETL/ELT workflows.
Practical knowledge of DBT for data transformation and modelling.
Experience with data ingestion from batch and streaming sources such as Kafka and Dataflow, or similar tools.
Prior involvement in data platform migrations, particularly transitioning from Oracle data warehouses to cloud platforms, is highly desirable (an illustrative Oracle-to-BigQuery translation follows this list).
Ability to work effectively in a collaborative, agile team environment focused on delivering large-scale data migration projects.
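As a hedged illustration of the Oracle-to-BigQuery translation work mentioned above, the sketch below shows how a simple Oracle table might map to a partitioned, clustered BigQuery table; the table and columns are invented for the example.

    -- Oracle source (illustrative only):
    --   CREATE TABLE LISTINGS (
    --     LISTING_ID  NUMBER(12)   PRIMARY KEY,
    --     PRICE       NUMBER(12,2),
    --     CREATED_AT  DATE
    --   );
    --
    -- One possible BigQuery equivalent, partitioned and clustered
    -- for analytical scan patterns:
    create table `analytics.listings`
    (
        listing_id  INT64 not null,  -- NUMBER(12) fits comfortably in INT64
        price       NUMERIC,         -- NUMBER(12,2) fits within NUMERIC
        created_at  TIMESTAMP        -- Oracle DATE carries a time component
    )
    partition by date(created_at)
    cluster by listing_id;

Partitioning by the event or load date and clustering on the join key is a common way to recover the pruning that Oracle indexes provided. Note that BigQuery does not enforce primary keys, so uniqueness is usually asserted downstream (for example with DBT tests).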