GFT Technologies

Mid-level/Senior Data Engineer (Engenheiro de Dados Pl/Sr)

full-time

Location Type: Hybrid

Location: Barueri, Brazil

About the role

  • Act as a technical reference in Azure data engineering, guiding the team on implementation standards, troubleshooting, and best practices;
  • Design, develop, and maintain data pipelines in Azure Data Factory and Databricks, ensuring resilient execution, reprocessing, and traceability;
  • Implement and evolve Lakehouse architecture with Delta Lake and Bronze, Silver, Gold layers;
  • Develop transformations in PySpark and SQL, focusing on performance, parallelism, partitioning, and cost reduction;
  • Implement data quality patterns: validations, reconciliation, deduplication, handling inconsistencies, and audit trails;
  • Design and implement integrations with corporate sources (relational databases and APIs) and support incremental ingestion and CDC where applicable;
  • Define and oversee data governance, security, and compliance standards, including access control and LGPD compliance;
  • Support the team in technical reviews, performance optimization, managing complex incidents, and root cause analysis;
  • Lead technical meetings, presentations, and workshops with the client in English;
  • Ensure adherence to data engineering best practices, observability, and production operations (logs, metrics, alerts, and SLAs).
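For context on the data-quality responsibilities above (validation, deduplication, audit trails), a minimal in-memory sketch of that pattern in plain Python might look like the following. This is illustrative only, not part of the role description; the field names (`id`, `amount`, `updated_at`) and rules are hypothetical examples:

```python
# Illustrative data-quality pass: validate, deduplicate, and keep an audit trail.
# All field names and validation rules here are hypothetical.

def quality_pass(records, key="id"):
    """Validate records, keep the latest copy per key, and log rejects."""
    audit = {"rejected": [], "deduplicated": 0}
    # Validation: require a non-null key and a non-negative amount.
    valid = [r for r in records if r.get(key) is not None and r.get("amount", 0) >= 0]
    audit["rejected"] = [r for r in records if r not in valid]
    # Deduplication: keep the record with the highest updated_at per key.
    latest = {}
    for r in valid:
        cur = latest.get(r[key])
        if cur is None or r["updated_at"] > cur["updated_at"]:
            latest[r[key]] = r
    audit["deduplicated"] = len(valid) - len(latest)
    return list(latest.values()), audit
```

In a production pipeline the same logic would typically run as PySpark window functions and Delta Lake constraints rather than plain Python.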

Requirements

  • Advanced or fluent English to work in an international environment;
  • Strong experience as a Data Engineer on Azure;
  • Proven production experience with Databricks (Jobs/Workflows, clusters, policies, repos and secrets);
  • Proficiency in Apache Spark (PySpark) and Python for distributed processing;
  • Advanced SQL for transformation, optimization, and analytical modeling;
  • Experience with Delta Lake (MERGE, schema evolution/enforcement, OPTIMIZE/ZORDER, VACUUM and best practices for writing/reading);
  • Proficiency with Azure Data Lake Storage Gen2 (ADLS) and organizing data by domain and layers;
  • Experience with Azure Data Factory (pipelines, triggers, parameterization, integrations and monitoring);
  • Experience with Git and CI/CD applied to data pipelines and environment promotion;
  • Solid knowledge of security (RBAC, Key Vault, identities) and auditing/compliance practices.

Preferred qualifications

  • Experience with Unity Catalog (governance, permissions, and lineage);
  • Knowledge of Delta Live Tables and Auto Loader;
  • Experience with IaC (Terraform, Bicep/ARM);
  • Experience with pipeline observability (Azure Monitor, Log Analytics, alerts, and runbooks);
  • Experience with Purview (catalog and governance);
  • Azure certifications (AZ-900, DP-900, DP-203) and/or Databricks certifications (Data Engineer Associate/Professional);
  • Experience in regulated industries (financial, insurance) with strong audit and compliance requirements.
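For context on the Delta Lake MERGE requirement above: a MERGE (upsert) applies a batch of changes to a target by key, updating matched rows and inserting unmatched ones. In Delta Lake this is `MERGE INTO target USING source ON ...`; the in-memory sketch below, not part of the posting and using hypothetical field names, models only the semantics:

```python
# Sketch of upsert (MERGE-like) semantics: update matched keys, insert new ones.
# The target is modeled as a dict keyed by primary key; field names are hypothetical.

def merge_upsert(target, source, key="id"):
    """Apply MERGE semantics: matched rows are updated, unmatched rows are inserted."""
    merged = {row[key]: row for row in target}  # existing target state, keyed by id
    for row in source:                          # incoming change batch
        # Update the matched row's fields, or insert the row if the key is new.
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())
```

In Databricks the equivalent would be `DeltaTable.merge` with `whenMatchedUpdateAll` and `whenNotMatchedInsertAll`, typically fed by incremental ingestion or CDC.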

Benefits

  • Multi-benefit card – you choose how and where to use it.
  • Study grants for undergraduate, postgraduate, MBA and language courses.
  • Certification incentive programs.
  • Flexible working hours.
  • Competitive salaries.
  • Annual performance review with a structured career plan.
  • Opportunity for international career development.
  • Wellhub and TotalPass.
  • Private pension plan.
  • Childcare assistance.
  • Health insurance.
  • Dental insurance.
  • Life insurance.
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Azure data engineering, data pipelines, Azure Data Factory, Databricks, Lakehouse architecture, Delta Lake, PySpark, SQL, data quality patterns, data governance
Soft Skills
technical reference, troubleshooting, best practices, performance optimization, root cause analysis, leadership, communication, presentation skills, team collaboration, client engagement
Certifications
AZ-900, DP-900, DP-203, Data Engineer Associate, Data Engineer Professional