Expleo Group

Data Engineer – Data Ingestion Platforms

Employment Type: Full-time

Location Type: Hybrid

Location: Bucharest, Romania

About the role

  • Develop and maintain batch and streaming data ingestion pipelines using established patterns
  • Implement data extraction from APIs, databases, filesystems, and event-based sources
  • Write clean, testable, and well-documented Python code aligned with team standards
  • Contribute to automation, monitoring, and logging of data pipelines
  • Participate in code reviews, integration testing, and agile delivery ceremonies
  • Support operational activities including incident investigation and hotfix implementation
  • Apply cost-aware practices when using cloud resources

Requirements

  • Solid experience with Python for data engineering (unit testing, packaging, documentation)
  • Hands-on exposure to PySpark and Pandas for data processing
  • Basic experience with streaming or event-driven architectures
  • Familiarity with containerization and CI/CD workflows (e.g., Docker, GitHub Actions)
  • Experience working in cloud-based data platforms, preferably AWS and Databricks
  • Practical use of AI-assisted coding tools (e.g., GitHub Copilot) in daily development

Benefits

  • Holiday Voucher
  • Private medical insurance
  • Performance bonus
  • Easter and Christmas bonus
  • Employee referral bonus
  • Bookster subscription
  • Work-from-home options, depending on the project

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Python, PySpark, Pandas, unit testing, packaging, documentation, streaming architectures, event-driven architectures, AI-assisted coding tools, cost-aware practices
Soft skills
clean code, testable code, well-documented code, automation, monitoring, logging, code reviews, integration testing, agile delivery, incident investigation