Data Engineer – Manager

PwC

Full-time

Location Type: Office

Location: Wellington, New Zealand

Job Level

Mid-Level, Senior

Tech Stack

AWS, Azure, Cloud, ETL, Google Cloud Platform, Informatica, Python, Spark, SQL, Terraform

About the role

  • Monitor and improve data quality by designing and implementing automated validation rules, performing ongoing data profiling, and executing data cleansing routines to ensure accuracy, completeness, and reliability.
  • Develop, maintain, and continuously refine comprehensive documentation covering data pipelines, data models, solution architecture, and end-to-end data flows to support transparency, scalability, and knowledge sharing.
  • Diagnose, troubleshoot, and resolve complex data pipeline failures, integration issues, and system performance bottlenecks, ensuring high availability and optimal performance across data platforms.
  • Proactively evaluate and adopt emerging Azure data services, industry best practices, and modern engineering patterns to enhance platform capability, security, and efficiency.
  • Collaborate closely with business stakeholders, data analysts, engineers, and product teams to gather requirements, translate them into technical solutions, and deliver robust, scalable, and high-quality data products.
  • Implement CI/CD practices and Infrastructure as Code (e.g., Terraform/Bicep) for deploying and maintaining scalable data infrastructure.
  • Design and optimise data pipelines using Azure services such as Data Factory, Databricks, Synapse, or Event Hub.
  • Ensure data governance by enforcing metadata standards, lineage tracking, and access controls.

Requirements

  • Minimum 5 years’ experience as a Data Engineer with a strong focus on Azure cloud data services, ideally within a professional services or consultancy environment.
  • Hands-on experience with Data Governance and Management tooling (e.g., Informatica, Collibra), including data integration, workflow orchestration, and performance optimisation.
  • Proven ability to design, build, and optimise scalable, secure data pipelines using Azure Data Factory and complementary Azure services such as Databricks, Synapse Analytics, Data Lake Storage, and Event Hub.
  • Strong documentation practices, with the ability to produce clear and maintainable technical documentation covering data processes, architectures, models, and data flows.
  • Demonstrated experience implementing data governance and metadata management frameworks using Microsoft Purview, ensuring data quality, compliance, lineage, and security controls.
  • Experience leading and executing data migration initiatives, ensuring data integrity, consistency, and reconciliation across legacy systems and Azure cloud platforms.
  • Strong capability in establishing and managing data quality processes, including automated validation rules, profiling, monitoring, and data cleansing to maintain trustworthy data assets.
  • Familiarity with AWS and Google Cloud Platform (GCP), enabling support for hybrid or multi-cloud architectures where required.
  • Proficiency in SQL, Python, or Spark for data transformation and complex logic development.
  • Experience with DevOps practices and Infrastructure as Code (e.g., Terraform, Bicep, GitHub Actions, Azure DevOps).
  • Understanding of data modelling (dimensional & relational) and modern data architectures such as data lakehouse, event-driven, or ETL-based patterns.

Benefits

  • Life & income protection
  • Wellbeing support (Sonder, EAP, Headspace)
  • 15 days’ paid sick leave
  • Group-rate health insurance (eligibility applies)
  • Flexible working
  • Supportive coaching culture
  • Purchase up to two extra weeks’ annual leave
  • Two recognition days provided each year
  • Annual summer shutdown period
  • Paid parental leave for all parents with flexible options and financial planning support
  • Inclusive networks
  • Paid volunteering leave
  • Discretionary bonus opportunities
  • Generous referral bonuses
  • Retail discounts & deals

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data engineering, data governance, data integration, data profiling, data cleansing, SQL, Python, Spark, Terraform, Bicep
Soft skills
collaboration, documentation, troubleshooting, problem-solving, communication, leadership, organizational skills, attention to detail, analytical thinking, stakeholder management