GT

Data Engineer, Azure

full-time

Origin: 🇪🇺 Anywhere in Europe

Job Level

Mid-Level, Senior

Tech Stack

Azure, ERP, ETL, PySpark, Python, SQL, Terraform

About the role

  • Own the Azure data platform architecture & roadmap (ADF vs Fabric; Synapse/Databricks evaluation)
  • Design, build, and operate ETL/ELT pipelines into ADLS/Warehouse
  • Model data for Power BI (DAX/Tabular)
  • Implement data quality, lineage, governance & security (Purview, RBAC, CI/CD)
  • Partner with BI analysts to deliver reusable, trusted semantic models and dashboards
  • Drive reliability & cost optimisation (monitoring, alerting, SLAs)
  • Support immediate projects: Business Central (ERP + MES), TrackWise (QMS), ECC6 extracts
  • Audit current estate, define migration plan, build ADF pipelines for priority sources
  • Establish baseline data quality checks and lineage, achieving >98% dataset refresh success
  • Deliver consolidated Lakehouse/Warehouse with governed semantic models optimised for cost & performance
  • Document controls and achieve stakeholder CSAT/NPS ≥8/10
  • Long-term ownership and growth as a trusted data leader in a global organisation

Requirements

  • 4-6 years of experience
  • Strong expertise in Azure Data Factory & Azure Data Lake Gen2
  • Advanced SQL/T-SQL
  • Power BI (DAX, Tabular modeling, deployment pipelines)
  • Python or PySpark
  • Git & Azure DevOps (CI/CD pipelines)
  • Dimensional modeling
  • Security & RBAC
  • Experience with Synapse, Databricks, and Delta Lake (nice-to-have)
  • Knowledge of Microsoft Purview, IaC (Bicep/Terraform) (nice-to-have)
  • Familiarity with ML basics (nice-to-have)
  • Background in regulated manufacturing/pharma (GxP) — can be learned
  • Strong communication & collaboration skills
  • Pragmatic “architect-builder” mindset
  • Ability to lead technology choices and engage stakeholders
  • Results-driven with focus on data reliability, governance, and business value