GFT Technologies

Data Analyst II

Employment Type: Full-time

Location Type: Remote

Location: Brazil

About the role

  • Design, build, and maintain scalable, resilient data pipelines
  • Work with large volumes of structured and unstructured data
  • Ensure data quality, consistency, and governance
  • Collaborate with software engineers, analysts and data scientists
  • Participate in technical decisions about data architecture and tooling

Requirements

  • Experience in data engineering within distributed environments
  • Proficiency in Apache Spark and Scala
  • Experience with AWS (Glue, S3, EMR, Athena, Redshift, etc.)
  • Knowledge of data modeling, ETL/ELT and data pipelines
  • Familiarity with relational databases and NoSQL
  • Experience with batch and streaming processing
  • Experience with Apache Kafka or Amazon MSK (nice-to-have)
  • Knowledge of legacy system modernization (nice-to-have)
  • Experience with Delta Lake, Apache Hudi or Iceberg (nice-to-have)
  • Familiarity with data CI/CD tools (dbt, Airflow, Terraform) (nice-to-have)

Benefits

  • Multi-benefits card – you choose how and where to use it.
  • Scholarships for Undergraduate, Graduate, MBA and language courses.
  • Certification incentive programs.
  • Flexible working hours.
  • Competitive salaries.
  • Annual performance evaluation with a structured career plan.
  • Opportunity for international career moves.
  • Wellhub and TotalPass.
  • Private pension plan.
  • Childcare assistance.
  • Health insurance.
  • Dental insurance.
  • Life insurance.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, Apache Spark, Scala, AWS, data modeling, ETL, ELT, batch processing, streaming processing, Apache Kafka
Soft Skills
collaboration, communication, technical decision-making