Sequoia Connect

Data Engineer

Full-time

Origin: 🇲🇽 Mexico


Job Level

Mid-Level / Senior

Tech Stack

BigQuery, Cloud, Cyber Security, ETL, Google Cloud Platform, Java, Python, SQL

About the role

  • Build and maintain data pipelines and ETL/ELT processes on Google Cloud Platform (GCP) to ensure reliable and efficient data flow.
  • Collaborate with Senior Data Engineers and cross-functional teams (Data Scientists, Product Managers) to gather requirements and implement solutions.
  • Implement data models, schemas, and transformations to support analytics and reporting.
  • Monitor, troubleshoot, and optimize pipelines to ensure data quality, integrity, and performance.
  • Ensure compliance with data governance, security, and regulatory standards within the GCP environment.
  • Document data workflows, tools, and best practices to support scalability and operational excellence.
  • Stay up to date on GCP services and trends to continuously improve infrastructure capabilities.

Requirements

  • Bachelor’s degree in Computer Science, IT, Data Engineering, or a related field.
  • Minimum 3 years of experience in data engineering, including building pipelines on Google Cloud Platform (GCP).
  • Proficiency with GCP tools, such as BigQuery, Dataflow, Pub/Sub, or Cloud Composer.
  • Strong skills in Python or Java, and advanced SQL for data processing.
  • Experience in data modeling, schema design, and data warehousing.
  • Understanding of data governance and cloud security practices.
  • Familiarity with Git and basic CI/CD practices is a plus.
  • Strong problem-solving and communication skills for technical collaboration.
  • Languages: advanced spoken English; native Spanish.