Remotebase
Website
LinkedIn
All Jobs
React.js
Node.js
JavaScript Development
Mobile Development
Web Development
Remotebase is a platform that connects companies with elite, pre-vetted software developers around the world. The company offers a diverse pool of professionals with skills in areas such as Python, JavaScript, AI, computer vision, and more, rigorously assessed for expertise and reliability.
Remotebase streamlines hiring by presenting a curated selection of candidates, letting companies save time and focus on critical project needs. The platform makes it easy to find and hire top talent for technology initiatives through a simple three-step process: share your needs, get matched with developers, and interview to choose the best fit.
51 - 200 employees
Founded in 2020
👥 HR Tech
🎯 Recruiting
☁️ SaaS
Data Engineer
19 minutes ago
🇧🇷 Brazil – Remote
⏰ Full-Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
AWS
Azure
Cloud
Docker
ETL
Google Cloud Platform
Jenkins
Kubernetes
Python
SQL
Terraform
Description
Design, build, and maintain scalable and resilient CI/CD pipelines for data applications and infrastructure, with a focus on Snowflake, dbt, and related data tools.
Implement and manage Snowflake dbt projects for data transformation, including developing dbt models, tests, and documentation, and integrating dbt into CI/CD workflows.
Develop and manage infrastructure as code (IaC) using Terraform to provision and configure cloud resources for data storage, processing, and analytics on GCP.
Automate the deployment, monitoring, and management of Snowflake data warehouse environments, ensuring optimal performance, security, and cost-effectiveness.
Collaborate with data engineers and data scientists to understand their requirements and provide robust, automated solutions for data ingestion, processing, and delivery.
Implement and manage monitoring, logging, and alerting systems for data pipelines and infrastructure to ensure high availability and proactive issue resolution.
Develop and maintain robust automation scripts and tools, primarily using Python, to streamline operational tasks, manage data pipelines, and improve efficiency; Bash scripting for system-level tasks is also required (a brief illustrative sketch follows this list).
Ensure security best practices are implemented and maintained across the data infrastructure and pipelines.
Troubleshoot and resolve issues related to data infrastructure, pipelines, and deployments in a timely manner.
Participate in code reviews for infrastructure code, dbt models, and automation scripts.
Document system architectures, configurations, and operational procedures.
Stay current with emerging DevOps technologies, data engineering tools, and cloud best practices, particularly related to Snowflake, dbt, and Terraform.
Optimize data pipelines for performance, scalability, and cost.
Support and contribute to data governance and data quality initiatives from an operational perspective.
Help implement AI features.
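As a rough illustration of the Python-based pipeline automation described above, the hypothetical script below triggers a dbt build via the dbt CLI and summarizes node statuses from dbt's run_results.json artifact. The script name, optional selector, and artifact path are assumptions for the sketch, not details from this posting.

```python
"""Minimal sketch: trigger a dbt build and summarize its results.

Assumes the dbt CLI is installed and the script runs from a dbt project
directory; the selector and paths are illustrative, not from the posting.
"""
import json
import subprocess
import sys
from pathlib import Path


def run_dbt_build(select: str | None = None) -> int:
    """Invoke `dbt build`, optionally limited to a selector, and return its exit code."""
    cmd = ["dbt", "build"]
    if select:
        cmd += ["--select", select]
    return subprocess.run(cmd).returncode


def summarize_run(project_dir: str = ".") -> None:
    """Count node statuses from dbt's target/run_results.json artifact."""
    artifact = Path(project_dir) / "target" / "run_results.json"
    counts: dict[str, int] = {}
    for node in json.loads(artifact.read_text())["results"]:
        counts[node["status"]] = counts.get(node["status"], 0) + 1
    for status, count in sorted(counts.items()):
        print(f"{status}: {count}")


if __name__ == "__main__":
    exit_code = run_dbt_build()
    summarize_run()
    sys.exit(exit_code)
```

In a CI/CD job (for example in GitHub Actions, GitLab CI, or Jenkins), glue code like this would typically run after dbt dependencies are installed and before deployment or alerting steps.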
🎯 Requirements
Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
5+ years of hands-on experience in a DevOps, SRE, or infrastructure engineering role.
3+ years of experience specifically focused on automating and managing data infrastructure and pipelines.
1+ years of experience enabling AI features.
Others:
Strong, demonstrable experience with Infrastructure as Code tools, particularly Terraform.
Strong background in DevOps principles and practices, and hands-on experience in building business intelligence solutions.
Strong automation and problem-solving skills, with proficiency in cloud technologies.
Ability to collaborate effectively with data engineers, analysts, and other stakeholders to ensure the reliability and performance of our data ecosystem.
Proven experience with dbt for data transformation, including developing models, tests, and managing dbt projects in a production environment.
Hands-on experience managing and optimizing Snowflake data warehouse environments (a brief connection sketch follows this list).
Demonstrable experience with data modeling techniques for ODS, dimensional modeling (Facts, Dimensions), and semantic models for analytics and BI.
Strong proficiency in Python for automation, scripting, and data-related tasks; experience with relevant Python libraries is a plus. Strong Bash scripting skills are also expected.
Solid understanding of CI/CD principles and tools (e.g., Bitbucket Runners, Jenkins, GitLab CI, GitHub Actions, Azure DevOps).
Experience with cloud platforms (GCP preferred, AWS, or Azure) and their data services.
Experience with containerization technologies (e.g., Docker, Kubernetes) is a plus.
Knowledge of data integration tools and ETL/ELT concepts.
Familiarity with monitoring and logging tools.
Strong SQL skills.
Ability to work independently and as part of a collaborative team in an agile environment.
Strong communication skills, with the ability to explain complex technical concepts clearly.
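To make the Snowflake and Python requirements above concrete, here is a minimal, hypothetical health-check sketch using the snowflake-connector-python package. The environment variable names, default warehouse, and queries are assumptions chosen for illustration, not details from this role description.

```python
"""Minimal sketch: connect to Snowflake and run a basic health check.

Requires snowflake-connector-python. Credential environment variable names
and the default warehouse below are illustrative assumptions.
"""
import os

import snowflake.connector


def snowflake_health_check() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "COMPUTE_WH"),
    )
    try:
        cur = conn.cursor()
        # Confirm the session context matches what automation expects.
        cur.execute("SELECT CURRENT_ACCOUNT(), CURRENT_WAREHOUSE(), CURRENT_ROLE()")
        print("session context:", cur.fetchone())
        # A trivial query to verify the warehouse resumes and serves queries.
        cur.execute("SELECT 1")
        assert cur.fetchone()[0] == 1
        print("warehouse responded to a test query")
    finally:
        conn.close()


if __name__ == "__main__":
    snowflake_health_check()
```

A check like this could be scheduled or run as a post-deployment step; in practice, credentials would come from a secrets manager rather than plain environment variables.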
🏖️ Benefits
Fully remote, with an optional office. You decide when to work from home and when to work from the office.
Flexible hours. You decide your own work schedule.
Market-competitive compensation (in $$).
Insane learning and growth.
Similar Jobs
Senior GCP Data Engineer
13 hours ago
Leega
201 - 500
Website
LinkedIn
All Jobs
Design and maintain data pipelines on GCP/AWS for the consultancy Leega. Build data models and ETL/ELT processes, and ensure governance for analytics.
🇧🇷 Brazil – Remote
⏰ Full-Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
AWS
Cloud
ETL
Google Cloud Platform
NoSQL
SQL
Tableau
Senior Data Engineer (SAS and GCP)
13 hours ago
Leega
201 - 500
Website
LinkedIn
All Jobs
Senior Data Engineer leading SAS-to-GCP migrations at Leega, an analytics consultancy. Design BigQuery models, build ETL pipelines, and optimize cloud infrastructure.
🇧🇷 Brazil – Remote
⏰ Full-Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
Apache
BigQuery
Cloud
ETL
Google Cloud Platform
Hadoop
Java
PySpark
Python
Scala
Shell Scripting
Spark
SQL
Terraform
Data Engineer
15 hours ago
FCamara Consulting & Training
1001 - 5000
Website
LinkedIn
All Jobs
Data Engineer building and maintaining data infrastructure on GCP for Brazil's largest construction company. Responsible for ETL, BigQuery, data modeling, and CI/CD.
🇧🇷 Brazil – Remote
⏰ Full-Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
BigQuery
Cloud
ETL
Google Cloud Platform
Python
Spark
SQL
SSIS
Terraform
Data Engineer
15 hours ago
FCamara Consulting & Training
1001 - 5000
Website
LinkedIn
All Jobs
Build and maintain GCP-based data pipelines and data lakes for a major Latin American construction company. Ensure data quality, security (LGPD compliance), and CI/CD for analytics products.
🇧🇷 Brazil – Remote
⏰ Full-Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
BigQuery
Cloud
ETL
Google Cloud Platform
Python
Spark
SQL
SSIS
Terraform
Data Engineer and/or Data Architect
16 hours ago
Logiks TI
201 - 500
Website
LinkedIn
All Jobs
Build and maintain AWS data lakes, ETL pipelines, and data warehouse architectures for a Brazilian IT consultancy. Use Python/PySpark, Airflow, and AWS analytics services (S3, Redshift, Glue, Athena).
🇧🇷 Brazil – Remote
⏰ Full-Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
Airflow
Amazon Redshift
AWS
Cloud
PySpark
Python
See More Data Engineer Jobs