
About the role
- Manage infrastructure as code (IaC) with Terraform, provisioning and configuring AWS resources (ECS, EMR, Lambda, Redshift, MSK, S3) and using the team's proprietary Terraform provider
- Operate and evolve the Data Lake with Raw, Processed and Refined zones, including deduplication, cataloging and storage optimization (Parquet, Iceberg)
- Administer and monitor Kafka clusters — topic creation, connectors, ACLs, credentials and tracking consumer lag for real-time streaming pipelines
- Maintain and evolve the Platform API, which exposes Data Loader, File Loader, Static Loader, Data Streaming and Business Metrics features for self-service consumption by other teams
- Investigate and resolve pipeline incidents (failed DAGs, dataset desynchronization, data duplication, Redshift/Spark issues), addressing root causes
- Participate in modernization initiatives and integration with AI tools (Claude Code, MCP Servers)
- Contribute to technical documentation and maintain observability tooling
Requirements
- Solid experience (3+ years) in Data Engineering or related areas
- Proficiency in Python for building pipelines, automation scripts and integrations
- Hands-on experience with advanced SQL
- Knowledge of Apache Airflow
- Experience with AWS services: S3, Redshift, EMR (Spark), Lambda, ECS, MSK (Kafka)
- Knowledge of Apache Kafka: topics, producers/consumers, connectors (Debezium, S3 Sink)
- Experience with Terraform or another Infrastructure as Code tool
- Familiarity with Git and CI/CD workflows (Bitbucket Pipelines or similar)
- Knowledge of Data Lake architectures
- Good communication skills and ability to work autonomously in an agile team
Nice-to-haves
- Experience with Apache Spark (PySpark, SparkSQL)
- Knowledge of C# / .NET
- Familiarity with Debezium for Change Data Capture (CDC)
- Experience with modern table formats (Apache Iceberg, Hudi)
- Knowledge of Grafana for monitoring and operational dashboards
- Experience with OpsGenie/JSM for incident and alert management
- Familiarity with Redshift
- Technical English for reading documentation and communicating with LATAM teams
Benefits
- Bradesco National Health Plan — extended to dependents without beneficiary discount
- Bradesco Dental Plan — optional
- Flexible meal/food allowance (VR/VA) — maintained during vacation
- Profit sharing (PLR)
- Wellhub
- Birthday day off
- Home office allowance
- Commuter voucher (VT) as needed, with legally permitted payroll deductions
- Life insurance
- Free access to all our products — AppsClub, Discount Club, TrueCaller, BTFit and Busuu
- Access to internal training via digital platforms
- Internal employee recognition program — Bemobucks
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, SQL, Apache Airflow, Terraform, Apache Kafka, Apache Spark, Data Lake architecture, Change Data Capture, modern table formats, observability tooling
Soft Skills
communication skills, autonomous work, agile team collaboration