Tech Stack
Tools & technologies: BigQuery, Cloud, Google Cloud Platform, SQL, Terraform, TypeScript
About the role
Key responsibilities & impact
- Design and industrialize data ingestion, transformation, and exposure pipelines on Google Cloud Platform (Dataflow, Cloud Functions, Pub/Sub, Cloud Composer)
- Model and manage a BigQuery data warehouse following a layered architecture (Data Warehouse / Data Mart) optimized for large-scale analytics
- Write and optimize complex SQL queries (window functions, partitioning, clustering, scan cost optimization)
- Implement data quality controls: consistency tests, anomaly detection, alerting
- Deploy and version data infrastructure using Terraform (BigQuery, GCS, IAM, service accounts, scheduled queries…)
- Set up CI/CD workflows to ensure reproducibility and traceability of deployments
- Document Terraform modules and maintain a resource catalog
- Build and maintain Looker Studio dashboards connected to BigQuery
- Ensure report performance (query optimization, caching, extracts)
- Maintain consistency of metrics across different dashboards
- Contribute to defining Data standards (naming conventions, SQL code review, documentation)
- Stay up to date on the GCP ecosystem and propose improvements
- Mentor junior profiles on Data Engineering best practices
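As a rough illustration of the kind of Terraform-managed BigQuery setup the responsibilities above describe, here is a minimal sketch using the standard Google provider. All dataset, table, and column names are hypothetical, not taken from the actual project:

```hcl
# Hypothetical sketch: a day-partitioned, clustered BigQuery table
# managed via the Terraform google provider. Names are illustrative.
resource "google_bigquery_dataset" "dwh" {
  dataset_id = "dwh"
  location   = "EU"
}

resource "google_bigquery_table" "events" {
  dataset_id = google_bigquery_dataset.dwh.dataset_id
  table_id   = "events"

  # Partition by event date and cluster on common filter columns
  # to reduce scan costs on large analytical queries.
  time_partitioning {
    type  = "DAY"
    field = "event_date"
  }
  clustering = ["user_id", "event_type"]

  schema = jsonencode([
    { name = "event_date", type = "DATE",   mode = "REQUIRED" },
    { name = "user_id",    type = "STRING", mode = "REQUIRED" },
    { name = "event_type", type = "STRING", mode = "NULLABLE" },
  ])
}
```

Partitioning plus clustering is a common pattern for keeping per-query scan costs predictable in a layered Data Warehouse / Data Mart architecture.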
Requirements
What you’ll need
- Minimum 5 years of professional experience in Data Engineering after graduation
- Strong expertise in Google Cloud Platform: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer, IAM
- Excellent command of advanced SQL (modeling, optimization, large volumes)
- Solid experience with Terraform (modules, state, CI/CD)
- Good knowledge of Looker Studio or equivalent data visualization tools
- Rigor, autonomy and strong analytical skills
- Team spirit and enthusiasm for working in a complex data environment
Benefits
Comp & perks
- Swile card (meal vouchers) credited €10/day worked, 60% covered by YEVO
- 9 to 12 RTT days (time off in lieu)
- 100% public transport subscriptions or contribution to transport costs
- Works council (CSE)
- Optional health insurance
- 100% employer-covered income protection
- Training and support for skills development
ATS Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data ingestion, data transformation, data exposure pipelines, SQL, data quality controls, Terraform, CI/CD workflows, data modeling, data warehousing, data analytics
Soft Skills
analytical skills, autonomy, team spirit, mentoring, communication, rigor, enthusiasm
