Skillfield

Spark Developer


Contract

Location Type: Hybrid

Location: Melbourne, Australia


About the role

  • Design, develop, and maintain scalable ETL and ELT pipelines using Apache Spark (batch and streaming).
  • Implement data transformation logic aligned with business rules and data quality standards.
  • Optimise Spark workloads for performance, reliability, and cost efficiency, including partitioning, caching, executor configuration, shuffle tuning, and Spark SQL optimisation (an illustrative sketch follows this list).
  • Build and manage data ingestion flows using Apache NiFi.
  • Support integration across cloud and hybrid environments, working with a range of data sources and platforms.
  • Develop supporting utilities, automation scripts, and microservices using Python and Go.
  • Partner with data architects on solution design, schemas, data models, and integration patterns.
  • Investigate and resolve pipeline failures, performance issues, data inconsistencies, and distributed system behaviours.
  • Ensure security, governance, and compliance requirements are met across data processes.
  • Contribute to shared standards, patterns, and continuous improvement across the data engineering practice.
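
As a rough illustration of the optimisation responsibilities above, a minimal PySpark batch job might combine a shuffle-partition setting, selective caching, and date-partitioned output. This is a sketch only: the paths, column names, and configuration values are assumptions made for illustration, not details of this role.

```python
# Illustrative sketch only -- table names, paths, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("orders_daily_etl")                    # hypothetical job name
    .config("spark.sql.shuffle.partitions", "200")  # shuffle tuning knob
    .config("spark.sql.adaptive.enabled", "true")   # let AQE coalesce small partitions
    .getOrCreate()
)

# Ingest a raw batch (source location is an assumption)
orders = spark.read.parquet("s3://raw-zone/orders/")

# Business-rule transformation plus a basic data-quality filter
clean = (
    orders
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("net_amount", F.col("amount") - F.coalesce(F.col("discount"), F.lit(0.0)))
)

# Cache only because the frame feeds more than one downstream aggregation
clean.cache()

daily_totals = (
    clean.groupBy("order_date", "region")
    .agg(F.sum("net_amount").alias("daily_net"))
)

# Partition curated output by date so downstream reads stay selective
(daily_totals
 .repartition("order_date")
 .write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://curated-zone/daily_totals/"))

spark.stop()
```

In practice, executor sizing and shuffle settings depend on cluster resources and data volumes, so the values above are placeholders rather than recommendations.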

Requirements

  • Hands‑on experience with Apache Spark (Scala or PySpark) for large‑scale data processing.
  • Experience building and managing data flows with Apache NiFi.
  • Proficiency in Python for ETL, automation, and data manipulation.
  • Working experience with Go for backend utilities or supporting services.
  • Understanding of distributed systems, cluster configuration, and performance tuning principles.
  • Experience working with cloud data platforms such as AWS, Azure, or GCP, including hybrid or on‑prem integrations.
  • Familiarity with CI/CD pipelines, Git‑based workflows, and modern engineering practices.
  • Exposure to containerisation and orchestration tools such as Docker and Kubernetes.
  • Understanding of data modelling, schema evolution, and data quality approaches (a minimal data-quality check is sketched after this list).
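
As a small illustration of the data-quality point above, a lightweight validation gate in PySpark might count rows that break simple rules and fail the run past a threshold. The column names, input path, and 1% threshold are assumptions for the sketch, not requirements of the role.

```python
# Minimal data-quality gate sketch -- path, columns, and threshold are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()

df = spark.read.parquet("s3://curated-zone/daily_totals/")  # assumed input

total = df.count()
null_keys = df.filter(F.col("order_date").isNull()).count()
negative_totals = df.filter(F.col("daily_net") < 0).count()

# Fail this pipeline step if more than 1% of rows violate the basic expectations
violations = null_keys + negative_totals
if total > 0 and violations / total > 0.01:
    raise ValueError(
        f"Data quality gate failed: {violations} of {total} rows violate checks"
    )

spark.stop()
```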

Benefits

  • Flexibility, support, and recognition based on outcomes

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Apache Spark, ETL, ELT, Python, Go, Apache NiFi, data transformation, data quality, performance tuning, data modelling
Soft Skills
problem solving, collaboration, communication, continuous improvement, attention to detail