Domo Inovação

Mid-level Data Engineer

Full-time

Location Type: Remote

Location: Brazil

About the role

  • Actively support the definition of requirements for IT/data infrastructure, contributing technical analysis of proposed solutions and alternatives, and assessing their impact on performance, cost, scalability, and adherence to the existing architecture;
  • Work with moderate autonomy, suggesting improvements and identifying technical risks, in line with the guidelines established by the senior team and corporate architecture;
  • Participate in the design and implementation of medium-complexity data architectures, following standards and guidelines defined by corporate architecture;
  • Collaborate with technical and non-technical teams, translating complex concepts into accessible language to facilitate alignment and decision-making;
  • Actively contribute to project success by anticipating needs, supporting other areas, and participating in the resolution of technical disagreements with maturity and a results-oriented focus;
  • Collaborate actively in defining and extracting data from different sources for ingestion into Big Data platforms, ensuring adherence to established standards;
  • Support the modeling and organization of data structures so they are appropriate, accessible, and available for consumption, ensuring alignment between operational data and the required analytical models;
  • Actively contribute to projects related to data and systems integration, implementing medium-complexity solutions and ensuring compliance with defined architectural standards;
  • Support the technical design of solutions, participate in the evolution of the existing architecture, and work on maintenance and continuous improvement of the systems under their responsibility.

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, Systems Analysis and Development, or related fields;
  • Hands-on experience developing and maintaining data pipelines;
  • Strong knowledge of Python for ETL, data manipulation, and APIs (Pandas, NumPy, PySpark);
  • Practical experience with Spark and/or Kafka;
  • Knowledge of batch and streaming processing;
  • Familiarity with Data Lake, Lakehouse, and Data Warehouse concepts;
  • Proficiency in SQL and experience with relational databases;
  • Basic knowledge of NoSQL;
  • Familiarity with DataOps, versioning, and pipeline monitoring;
  • Practical application of agile methodologies (Scrum, Kanban);
  • Postgraduate degrees and certifications in IT;
  • Experience in regulated environments (e.g., the financial sector);
  • Experience with orchestration tools (e.g., Airflow);
  • Knowledge of integration with ML models or consumption of AI APIs;
  • Experience working in mid-sized environments and processing significant volumes of data.
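To make the batch-ETL requirements above concrete, the sketch below shows the kind of extract-transform-load pipeline the posting describes, using Pandas and a relational store. It is illustrative only: the `transactions` table and the column names are hypothetical, not taken from the company's stack.

```python
# Minimal batch ETL sketch: extract -> transform -> load.
# Illustrative only; the table and column names are hypothetical.
import sqlite3

import pandas as pd


def run_pipeline(raw_rows, db_path=":memory:"):
    # Extract: in practice this step would read from an API, a file, or a source DB.
    df = pd.DataFrame(raw_rows)

    # Transform: coerce types and drop records whose amount is not numeric.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df = df.dropna(subset=["amount"])
    df["amount"] = df["amount"].round(2)

    # Load: write the curated table to a relational store (SQLite here for brevity).
    conn = sqlite3.connect(db_path)
    df.to_sql("transactions", conn, if_exists="replace", index=False)
    loaded = pd.read_sql("SELECT COUNT(*) AS n FROM transactions", conn)["n"][0]
    conn.close()
    return int(loaded)


if __name__ == "__main__":
    rows = [
        {"id": 1, "amount": "10.5"},
        {"id": 2, "amount": "not-a-number"},  # dropped during transform
        {"id": 3, "amount": "7.25"},
    ]
    print(run_pipeline(rows))
```

In production the same shape would typically be scheduled by an orchestrator such as Airflow and scaled with PySpark, with each stage monitored as the DataOps bullet above suggests.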

Benefits

  • Health insurance and dental plan
  • Wellhub (Gympass)
  • Transportation allowance
  • Meal and food allowances
  • Domo School (learning platform) and partnerships with educational institutions
  • Private pension plan
  • Life insurance
  • Day off
  • Extended maternity and paternity leave

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Python, ETL, data manipulation, APIs, Pandas, NumPy, PySpark, SQL, NoSQL, data pipelines
Soft Skills
critical judgment, collaboration, communication, results-oriented focus, technical analysis, problem-solving, maturity, anticipation of needs, decision-making, risk identification
Certifications
Bachelor’s degree in Computer Science, Bachelor’s degree in Information Systems, Bachelor’s degree in Systems Analysis and Development, Postgraduate degrees in IT