ALTEN

Senior Data Engineer – Data Platform

Contract

Location Type: Remote

Location: Remote • 🇲🇦 Morocco

Job Level

Senior

Tech Stack

Apache • AWS • BigQuery • Cloud • Docker • Google Cloud Platform • Java • Kafka • Kubernetes • Linux • Scala • Spark • SQL

About the role

  • Design and implement high-performance data pipelines on GCP (BigQuery, Bigtable, Dataproc)
  • Develop distributed processing workloads using Spark and Kafka
  • Contribute to the implementation of CI/CD practices (GitLab, Docker) and the automation of data workflows
  • Ensure code quality through unit and integration tests
  • Collaborate with development and production teams in an Agile (Scrum) environment

Requirements

  • Master's degree (Bac+5) in Computer Science or equivalent
  • Experience: 7+ years, with strong knowledge of Cloud environments (GCP or AWS) and distributed processing technologies (Kafka, Spark, BigQuery)
  • Development: Scala and/or Java (expert level)
  • Data technologies: Apache Spark, Apache Kafka, BigQuery, Bigtable
  • Cloud: GCP (Cloud Storage, Dataproc, Kubernetes) or equivalent AWS services
  • CI/CD: GitLab CI/CD, Docker Compose
  • Advanced SQL, Linux, and scripting
  • Agile/Scrum methodology
  • Strong written and verbal communication skills (French/English)

Benefits

  • Flexible work arrangements

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data pipelines • GCP • BigQuery • Bigtable • Dataproc • Spark • Kafka • Scala • Java • SQL
Soft skills
communication skills • collaboration • code quality • Agile • Scrum
Education
Master's degree in Computer Science