
Data Engineer
Movida Aluguel de Carros
Contract
Location Type: Hybrid
Location: Mogi das Cruzes • Brazil
About the role
- We are seeking a Senior Data Engineer to join our technology and analytics team.
- This professional will be responsible for designing, developing, and maintaining robust and scalable data pipelines, with a focus on producing strategic reports and dashboards.
- The role involves integrating multiple data sources — such as Excel, SAP, Oracle, and Google BigQuery — and working closely with business areas to turn data into valuable insights.
- Data Pipeline Development: Create and optimize pipelines for data ingestion, transformation, and loading using cloud computing, big data, machine learning, and generative AI technologies.
- Integration of Diverse Sources: Lead the integration of data from transactional systems (SAP ECC/S/4HANA, Oracle EBS/Database) and data warehouses into analytical platforms such as Google BigQuery and Fabric OneLake.
- Data Modeling: Develop efficient and scalable data models aligned with reporting, dashboards, and advanced analytics needs.
- Analytics and Visualization: Translate business requirements into technical solutions, ensuring accurate and up-to-date data for reports and Power BI dashboards.
- Quality and Governance: Implement data validation and governance processes to ensure consistency, compliance, and comprehensive technical documentation.
- Performance Optimization: Identify and implement technical and operational improvements, with a focus on accounting and financial modeling, and drive analytics initiatives.
- Cross-functional Collaboration: Partner with multiple areas across the company to gather requirements and propose innovative solutions.
Requirements
- Experience as a Data Engineer with a focus on analytics, reporting, and systems integration.
- SAP: Experience extracting and integrating data (ECC, S/4HANA, BW/BPC).
- Oracle: Proficiency with Oracle Database, including PL/SQL and query optimization.
- Google BigQuery: Experience designing tables, partitioning, clustering, and query optimization.
- Languages: Proficiency in SQL and Python for automation and pipeline development.
- ETL/ELT Tools: Knowledge of tools such as Apache Airflow, Dataflow, Matillion, Azure Data Factory, Talend, and Informatica, among others.
- Cloud: Strong experience with Google Cloud Platform (GCP) — Composer, Dataflow, Cloud Storage, Pub/Sub. Knowledge of AWS or Azure is a plus.
- Data Modeling: Strong expertise in dimensional modeling (Star Schema, Snowflake) and experience with lakehouse architectures.
- Must have easy access to the Mogi das Cruzes region.
Benefits
- Work model: Hybrid — 3 days per week onsite and 2 days remote.
- Location: Mogi das Cruzes (Brás Cubas)
- Contract type: PJ (B2B/contractor)
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data engineering, data pipeline development, data modeling, SQL, Python, ETL, ELT, dimensional modeling, cloud computing, machine learning
Soft Skills
cross-functional collaboration, communication, problem-solving, analytical thinking, innovation