Tech Stack
BigQuery, Cloud, ETL, Google Cloud Platform, Kafka, Oracle, SQL
About the role
- Support migration of data models from Oracle to BigQuery using DBT and ELT pipelines.
- Develop, optimise, and maintain analytical data models following the Medallion architecture within BigQuery (see the dbt sketch after this list).
- Translate complex Oracle table structures into scalable GCP data models to support analytical use cases.
- Work with ingestion pipelines leveraging Kafka (batch) and Dataflow to ensure reliable data availability in BigQuery.
- Contribute to the build-out of the analytical data warehouse, ensuring data quality and governance standards are met.
- Engage with prioritised migration plans targeting 60% migration within the year and full migration by mid-2025.
- Participate in code freezes and project deliverables; onboarding into Rightmove targeted for late October/early November.
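
As a concrete illustration of the dbt-on-BigQuery modelling described above, here is a minimal sketch of a silver-layer incremental model. It is a sketch under assumptions, not Rightmove's actual code: the source name (oracle_raw), the table (property_listings), and all column names are hypothetical.

```sql
-- models/silver/stg_property_listings.sql
-- Hypothetical silver-layer dbt model translating an Oracle-landed raw table
-- into a typed, incrementally loaded BigQuery model. All names are illustrative.
{{ config(materialized='incremental', unique_key='listing_id') }}

select
    cast(listing_id as int64)  as listing_id,
    trim(property_type)        as property_type,
    -- Oracle DATE values often land as strings; normalise to TIMESTAMP
    timestamp(listed_date)     as listed_at,
    current_timestamp()        as loaded_at
from {{ source('oracle_raw', 'property_listings') }}
{% if is_incremental() %}
-- On incremental runs, only pick up rows newer than what is already loaded
where timestamp(listed_date) > (select max(listed_at) from {{ this }})
{% endif %}
```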
Requirements
- Strong proficiency in SQL and data modeling best practices.
- Hands-on experience with GCP BigQuery and cloud-based ETL/ELT workflows.
- Practical knowledge of DBT for data transformation and modeling.
- Experience with data ingestion from batch and streaming sources such as Kafka and Dataflow, or similar tools (see the deduplication sketch after this list).
- Prior involvement in data platform migrations, particularly transitioning from Oracle data warehouses to cloud platforms, is highly desirable.
- Ability to work effectively in a collaborative, agile team environment focused on delivering large-scale data migration projects.
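
To make the ingestion requirement concrete, the sketch below shows one common bronze-to-silver pattern in BigQuery SQL: deduplicating Kafka-delivered events, since at-least-once delivery can write the same event more than once. The dataset, table, and column names (analytics.bronze_kafka_events, event_id, and so on) are assumptions for illustration, not part of the job description.

```sql
-- Hypothetical dedup step: keep only the latest copy of each Kafka-delivered event.
merge into `analytics.silver_events` t
using (
  select * except (rn)
  from (
    select
      event_id,
      payload,
      event_ts,
      -- Rank duplicates of the same event, newest first
      row_number() over (partition by event_id order by event_ts desc) as rn
    from `analytics.bronze_kafka_events`
  )
  where rn = 1
) s
on t.event_id = s.event_id
when not matched then
  insert (event_id, payload, event_ts)
  values (s.event_id, s.payload, s.event_ts);
```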
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
SQL, data modeling, DBT, ETL, ELT, BigQuery, data ingestion, data transformation, data quality, data governance
Soft skills
collaboration, agile methodology, project management, communication, problem-solving