Data Engineer

360Learning

full-time

Posted on:

Origin: 🇪🇸 Spain

Job Level

Junior, Mid-Level

Tech Stack

Airflow, Amazon Redshift, AWS, Azure, BigQuery, Cloud, ETL, Google Cloud Platform, Python, SQL

About the role

  • Build and run 360Learning’s data platform that powers product features and company-wide insights
  • Process billions of events monthly and maintain over 30TB of data with 200+ data pipelines
  • Scale the platform and deliver accurate, reliable, well-structured data
  • Work embedded within the Engineering Tech squad alongside DevOps and Architects
  • Take ownership of data pipelines end-to-end (design, build, monitor)
  • Optimize SQL/dbt models and ETL pipelines for performance and cost
  • Collaborate with Analysts and Product squads to deliver datasets and reliable pipelines
  • Improve observability (logging, alerting, monitoring) in the data stack
  • Design and implement real-time pipelines to serve Product features
  • Drive initiatives around data reliability, scalability, and cost optimization
  • Help define the Data Engineering roadmap and lead design of new components
  • Contribute to embedding AI into teams across the company and in client-facing solutions

Requirements

  • 2+ years of experience in a Data Engineering or similar role
  • Experience with cloud providers (AWS, Azure, GCP)
  • Proficient in SQL and Python
  • Experience with an orchestration tool (e.g. Airflow)
  • Experience with a data warehouse (Snowflake, BigQuery, Redshift)
  • Strong understanding of data modelling, warehousing principles, and performance optimization techniques
  • Ability to break down ambiguous problems into concrete, manageable components
  • Experience working with different types of stakeholders and clearly presenting technical and business problems
  • Listening skills: open to input from other team members and departments
  • Fluent English (US/UK) / B2 level or equivalent (FR)