UltraCon Consultoria

Data & Analytics Engineer

Full-time

Location Type: Hybrid

Location: São Paulo, Brazil

Job Level

Mid-Level, Senior

Tech Stack

Azure, Cloud, ETL, Google Cloud Platform, Oracle, PySpark, Python, Scala, Spark, SQL, SSIS

About the role

  • Develop and maintain data ingestion, transformation and loading pipelines (ETL/ELT) from multiple sources (APIs, relational databases, files, legacy systems and cloud)
  • Work on data modeling (Data Lake, Data Warehouse and Data Marts), ensuring performance, scalability and governance
  • Process large volumes of data using Spark (PySpark or Scala) and Big Data frameworks
  • Develop scripts and automations in Python for data extraction, integration and processing
  • Create and optimize advanced SQL queries (procedures, views, functions, jobs), with a focus on performance
  • Work in cloud environments (GCP, Oracle Cloud, Azure or similar), participating in data migration and modernization projects
  • Integrate data across different platforms and systems (e.g., CRM, marketing, surveys, corporate systems)
  • Support the development of dashboards and reports in Power BI, ensuring reliable data for analysis and decision-making
  • Collaborate with business, analytics and technology teams to understand requirements and translate them into data solutions
  • Document processes, models and data pipelines, following engineering and governance best practices

Requirements

  • Proven experience as a Data Engineer or in similar roles
  • Advanced SQL proficiency, including query optimization and database objects
  • Strong experience in ETL/ELT using tools such as PowerCenter, Pentaho, Databricks, SSIS or equivalents
  • Practical knowledge of Python for data engineering and automation
  • Experience with distributed processing (Spark – PySpark or Scala)
  • Experience in cloud environments (Google Cloud, Oracle Cloud, Azure or similar)
  • Experience with Data Lakes, Data Warehouses and data modeling
  • Knowledge of data integration via APIs, files and secure transfer (SFTP)
  • Ability to communicate with technical and business stakeholders

Benefits

  • Remote work
  • Join a high-performance team
  • Opportunities for professional development

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
ETL, ELT, data modeling, SQL, Python, Spark, PySpark, Scala, data integration, data processing
Soft skills
communication, collaboration