MagicOrange

Senior Data Engineer – Architect

Employment Type: Full-time

Location Type: Hybrid

Location: Durban, South Africa

About the role

  • Design, develop, and maintain robust ETL / ELT pipelines across cloud-based data platforms.
  • Build and optimise data lakes and lakehouse architectures (Medallion approach) using Azure and Databricks.
  • Develop scalable, reusable, and metadata-driven data processing frameworks.
  • Ensure data pipelines are reliable, maintainable, and production-ready across dev, test, and production environments.
  • Design and implement dimensional and analytical data models to support reporting, analytics, and AI use cases.
  • Optimise data processing and query performance across large-scale datasets.
  • Implement indexing, partitioning, and distribution strategies to ensure high-performance data access.
  • Continuously monitor and improve data platform efficiency and cost usage.
  • Implement best practices for data quality, validation, and observability.
  • Support data governance initiatives, including metadata management, lineage, and data compliance.
  • Ensure data platforms align with security, privacy, and regulatory requirements (including GDPR).
  • Collaborate with DevOps and security teams to ensure secure and compliant data environments.
  • Work closely with data science and AI teams to enable model training, deployment, and experimentation.
  • Support Databricks-based analytics, streaming, and AI workloads.
  • Translate business and product requirements into scalable data solutions.
  • Provide technical guidance and mentorship to other data engineers.

Requirements

  • 8+ years of professional experience in data engineering or data warehousing, with demonstrated operation at senior or lead level in production environments.
  • Proven experience designing, building, and owning end-to-end data platforms (ETL / ELT, data lakes or lakehouse architectures) in the cloud, preferably Microsoft Azure.
  • Strong hands-on experience with SQL and modern data processing frameworks (e.g. Databricks, Spark, PySpark) to deliver scalable, high-performance data solutions.
  • Demonstrated ability to make architecture, performance, and cost optimisation decisions, while collaborating closely with engineering, DevOps, and data science teams.
  • Matric (Grade 12 or equivalent).
  • Relevant Bachelor’s degree or Diploma in a related field, such as Computer Science, Information Systems, Data Science / Business Analytics, Engineering, or a similar technical discipline.
  • Relevant professional certifications in cloud platforms or data engineering (e.g. Azure, Databricks) are advantageous.

Benefits

  • A strong entrepreneurial spirit, with the ability to make a real impact and see the results of your efforts
  • Ongoing training and exposure to the latest cloud, data, and AI technologies
  • The opportunity to work in a high-growth industry on a globally recognised SaaS product
  • A challenging and rewarding career in an innovative, forward-thinking company
  • The ability to influence outcomes, working in an open, collaborative environment close to decision-makers
  • A competitive remuneration package, including flexible pension options

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
ETL, ELT, data lakes, lakehouse architectures, SQL, Databricks, Spark, PySpark, data processing frameworks, data modeling
Soft Skills
collaboration, mentorship, technical guidance, problem-solving, communication, leadership, decision-making, optimisation, data governance, data quality
Certifications
Azure certification, Databricks certification