GFT Technologies

Data Engineer – Azure, Databricks

Full-time

Location Type: Hybrid

Location: Alphaville, Brazil

About the role

  • Design and develop scalable, robust data ingestion and transformation pipelines on Azure and Databricks;
  • Build solutions based on Lakehouse architecture, organized into bronze, silver and gold layers (a sketch follows this list);
  • Ensure pipeline performance and scalability through best practices in partitioning, caching and parallelism;
  • Automate processes with a focus on reusability, monitoring and quality control;
  • Act as a technical reference for the data engineering team, promoting best practices and standardization.
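
For illustration only, a minimal PySpark sketch of the bronze/silver/gold (medallion) flow and the partitioning and caching practices mentioned above. All table and path names (orders, /mnt/lake/...) are hypothetical placeholders, not details of the actual project.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: ingest raw files as-is, adding load metadata.
bronze = (spark.read.format("json").load("/mnt/lake/raw/orders/")
          .withColumn("ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").save("/mnt/lake/bronze/orders")

# Silver: cleaned, typed and deduplicated records.
silver = (spark.read.format("delta").load("/mnt/lake/bronze/orders")
          .dropDuplicates(["order_id"])
          .withColumn("order_date", F.to_date("order_ts")))
silver.cache()  # reused below for both the write and the gold aggregate

(silver.write.format("delta").mode("overwrite")
       .partitionBy("order_date")  # partitioning enables scan pruning downstream
       .save("/mnt/lake/silver/orders"))

# Gold: business-level aggregates ready for consumption.
gold = (silver.groupBy("order_date")
              .agg(F.sum("amount").alias("total_amount"),
                   F.count("*").alias("order_count")))
gold.write.format("delta").mode("overwrite").save("/mnt/lake/gold/daily_orders")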

Requirements

  • Experience with transactional databases and requirements analysis;
  • Knowledge of Data Governance;
  • Strong familiarity with Azure Data Lake Gen2, Azure Data Factory, Key Vault and integration across Azure services (see the sketch after this list);
  • Proficiency in Spark (PySpark) and Python for developing pipelines and transforming data;
  • Experience with Git and CI/CD practices applied to data pipelines;
  • Advanced conversational English.
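
As a hedged illustration of the Azure integrations listed above, the snippet below reads a service principal secret from a Key Vault-backed secret scope and queries ADLS Gen2 over abfss://. It assumes the spark and dbutils globals provided in a Databricks notebook; the scope, key, storage account and container names are placeholders.

# Databricks notebook sketch; scope, key, account and container names are placeholders.
client_secret = dbutils.secrets.get(scope="kv-backed-scope", key="sp-client-secret")

# OAuth configuration for ADLS Gen2 access with a service principal.
storage = "mystorageacct.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{storage}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage}", "<application-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage}", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage}",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

# Read raw data straight from the lake into a DataFrame.
df = spark.read.parquet("abfss://raw@mystorageacct.dfs.core.windows.net/orders/")
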
Benefits
  • Multi-benefits card – choose how and where to use it.
  • Scholarships for undergraduate, graduate, MBA and language courses.
  • Certification incentive programs.
  • Flexible working hours.
  • Competitive salaries.
  • Annual performance review with a structured career plan.
  • Possibility of international career opportunities.
  • Wellhub and TotalPass.
  • Private pension plan.
  • Childcare allowance.
  • Health insurance.
  • Dental care.
  • Life insurance.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data ingestion, data transformation, Lakehouse architecture, partitioning, caching, parallelism, Spark, PySpark, Python, CI/CD
Soft skills
communication, acting as a technical reference, promoting best practices, standardization