
Data Engineer
OnBuy
Full-time
Location Type: Remote
Location: United Kingdom
About the role
- Developing and scaling analytics infrastructure using modern cloud-based data platforms and tooling (e.g., BigQuery, Snowflake, Databricks).
- Designing, building, and maintaining robust data pipelines to ingest, transform, and deliver high-quality datasets for analytics and reporting.
- Owning and evolving the semantic data layer, ensuring clean, well-modelled datasets that enable self-serve analytics and data-driven decision making.
- Collaborating with the analytics team, business stakeholders, and the tech function to understand requirements and deliver scalable solutions that meet business needs.
- Driving innovation through the development of data products, such as feature stores, automated insights, or ML-ready datasets.
Requirements
- Hands-on experience developing and managing cloud-based data warehousing environments (BigQuery, Snowflake, Redshift).
- Practical experience across GCP services including IAM, Cloud Run, Artifact Registry, GKE, BigQuery, GCS, and Datastream.
- An understanding of data orchestration (Apache Airflow or other DAG-focussed solutions preferred).
- Knowledge of ETL/ELT tools such as Airbyte, Fivetran, or Stitch.
- Experience with containerisation and orchestration (Docker, Kubernetes, Helm).
- Understanding of CI/CD workflows (GitLab CI/CD, GitHub Actions preferred).
- The ability to create and manage multiple data pipelines from development environments through to production.
- A basic understanding of MySQL architecture for application data replication purposes.
- Experience extracting data from REST APIs and ingesting it into warehousing environments.
- Basic GCP administration experience (working knowledge of Terraform is a nice-to-have).
- Coding Skills:
- SQL - the ability to write complex queries to build normalised data models (see the SQL sketch after this list).
- Python - working experience, including the ability to write DAGs that extract data from third-party APIs (a minimal sketch follows this list).
- Experience with version control using Git.
- An understanding of data security, cloud permission management, and cross-country/cross-continent data storage.
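
To make the Airflow and API-extraction expectations concrete, here is a minimal sketch of the kind of DAG described above: pulling a day of records from a third-party REST API and appending them to a BigQuery staging table. It assumes a recent Airflow 2.x and the google-cloud-bigquery client; the endpoint, dataset, and table names are hypothetical, not OnBuy's actual stack.

# A minimal sketch of the kind of extract-and-load DAG the role describes.
# Assumes Airflow 2.x and google-cloud-bigquery; every name below
# (endpoint, dataset, table) is hypothetical.
from datetime import datetime, timedelta

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def extract_orders(**context):
    """Pull one day of orders from a hypothetical third-party REST API."""
    resp = requests.get(
        "https://api.example.com/v1/orders",   # hypothetical endpoint
        params={"date": context["ds"]},        # Airflow's logical date (YYYY-MM-DD)
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["orders"]               # assumed response shape


def load_orders(**context):
    """Append the extracted rows to a hypothetical BigQuery staging table."""
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    client = bigquery.Client()                 # uses application-default credentials
    errors = client.insert_rows_json("analytics.staging_orders", rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")


with DAG(
    dag_id="orders_api_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                         # Airflow 2.4+ spelling of schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    extract >> load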
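
And a hedged illustration of the complex-SQL requirement: a BigQuery statement, run from Python, that normalises a raw staging table down to one row per order by keeping only the latest version of each record. Table names are again hypothetical.

# Deduplicate a raw staging table into a clean, one-row-per-order table.
# Dataset and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

NORMALISE_ORDERS = """
CREATE OR REPLACE TABLE analytics.orders AS
SELECT * EXCEPT (row_num)
FROM (
  SELECT
    *,
    ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY updated_at DESC) AS row_num
  FROM analytics.staging_orders
)
WHERE row_num = 1
"""

client.query(NORMALISE_ORDERS).result()  # .result() blocks until the job finishes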
Benefits
- Company Equity: In return for helping us to grow, we’ll offer you company equity, meaning you own a piece of this business we are all working so hard to build.
- 25 days annual leave + Bank Holidays
- 1 extra day off for your Birthday
- Employee Assistance Programme
- Perks at Work benefit platform
- Opportunities for career development and progression
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data pipelines, ETL, ELT, SQL, Python, data warehousing, data orchestration, containerization, CI/CD, data security
Soft skills
collaboration, innovation, problem-solving, communication, stakeholder engagement