Tech Stack
Airflow, BigQuery, Cloud, ETL, Google Cloud Platform, Postgres
About the role
- Design, develop and maintain data pipelines using Dataflow on GCP (see the sketch after this list).
- Integrate and transform data from different sources such as Cloud Storage, Pub/Sub, BigQuery and PostgreSQL.
- Ensure data quality, security and governance at all stages of the process.
- Collaborate with BI, Analytics and Engineering teams to deliver data-driven solutions.
- Optimize performance and cost of cloud solutions.
- Document processes and data engineering best practices.
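To give a concrete picture of the pipeline work described above, here is a minimal sketch of a streaming Dataflow job, assuming the Apache Beam Python SDK. The project, topic, table, and schema names are hypothetical placeholders, not details from this posting.

```python
# A minimal sketch of a streaming Dataflow pipeline, assuming the Apache Beam
# Python SDK. The project, topic, table, and schema names are hypothetical
# placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Pass --runner=DataflowRunner (plus project/region/staging flags) to run
    # on GCP; with no flags this runs locally on the DirectRunner.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw bytes from a Pub/Sub topic.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            # Decode and wrap each message into a row matching the BQ schema.
            | "ToRow" >> beam.Map(lambda msg: {"raw": msg.decode("utf-8")})
            # Append rows to a BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="raw:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```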
Requirements
- GCP Experience: Proven experience with Google Cloud Platform, especially Dataflow, Cloud Storage and Pub/Sub.
- Databases: Knowledge of PostgreSQL and data modeling.
- ETL/ELT: Experience with ETL/ELT processes.
- Versioning and CI/CD: Familiarity with version control (Git) and CI/CD practices.
- Teamwork: Ability to work in agile, collaborative environments.
- English: Intermediate English for documentation and communication.
- BigQuery: Experience or knowledge of BigQuery.
- GCP Certifications: e.g., Professional Data Engineer.
- Orchestration Tools: Experience with orchestration tools such as Composer/Airflow (a minimal DAG sketch follows this list).
- Data Operations: Knowledge of DataOps practices and data observability.
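As a hedged illustration of the orchestration item above, here is a minimal DAG sketch, assuming Apache Airflow 2.x (the engine behind Cloud Composer). The DAG id, schedule, and shell commands are hypothetical placeholders.

```python
# A minimal DAG sketch, assuming Apache Airflow 2.x. On Airflow < 2.4 use
# schedule_interval instead of schedule; all names here are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extract")
    load = BashOperator(task_id="load", bash_command="echo load")

    # Run the extract step before the load step.
    extract >> load
```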
Benefits
- Meal or food allowance
- Discounts on courses, universities and language schools
- Stefanini Academy: a platform offering free, up-to-date online courses with certificates
- Mentoring
- Childcare assistance
- Benefits club for medical consultations and exams
- Medical assistance
- Dental assistance
- Discount club with benefits at partner establishments
- Travel club
- Pet care benefits
- And much more...
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Dataflow, Cloud Storage, Pub/Sub, PostgreSQL, ETL, ELT, BigQuery, Git, CI/CD, Data modeling
Soft skills
Teamwork, Collaboration, Communication
Certifications
GCP certifications, Professional Data Engineer