qode.world

GCP Data Engineer

full-time

Location Type: Hybrid

Location: New Jersey, United States


About the role

  • Develop and maintain scalable data pipelines using GCP services
  • Build and optimize ETL/ELT workflows using Dataflow and BigQuery
  • Orchestrate workflows using Cloud Composer (Apache Airflow)
  • Perform data migration from legacy systems (e.g., Teradata, on-prem databases) to GCP
  • Develop reusable and efficient Python-based data processing frameworks
  • Write optimized and complex SQL queries for data transformation and analytics
  • Leverage AI-native engineering tools (e.g., code assistants, automated testing, query optimization tools) to improve engineering throughput
  • Ensure data quality, validation, and governance compliance
  • Monitor and troubleshoot data pipelines and production issues
  • Optimize pipelines for performance, scalability, and cost efficiency
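As a rough illustration of the "reusable Python-based data processing frameworks" responsibility above, here is a minimal sketch of a composable pipeline pattern. All names (`drop_null_ids`, `uppercase_region`, the record fields) are hypothetical examples, not taken from the posting:

```python
from typing import Callable, Dict, Iterable

# A record is a plain dict; a step maps a stream of records to a new stream.
Record = Dict[str, object]
Step = Callable[[Iterable[Record]], Iterable[Record]]

def pipeline(*steps: Step) -> Step:
    """Compose steps left-to-right into one reusable pipeline function."""
    def run(records: Iterable[Record]) -> Iterable[Record]:
        for step in steps:
            records = step(records)
        return records
    return run

# Example steps: drop rows missing an id, then normalize a string field.
def drop_null_ids(records: Iterable[Record]) -> Iterable[Record]:
    return (r for r in records if r.get("id") is not None)

def uppercase_region(records: Iterable[Record]) -> Iterable[Record]:
    return ({**r, "region": str(r["region"]).upper()} for r in records)

clean = pipeline(drop_null_ids, uppercase_region)

rows = [{"id": 1, "region": "nj"}, {"id": None, "region": "ca"}]
print(list(clean(rows)))  # [{'id': 1, 'region': 'NJ'}]
```

Steps built this way stay individually testable and can be reused across pipelines, which is the usual motivation for a framework like this.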

Requirements

  • 6-15+ years of overall experience, including 3-5 years of relevant hands-on work with GCP services
  • B.Tech., M.Tech. or MCA degree from a reputed university
  • Strong hands-on experience with Google Cloud Platform (GCP): BigQuery, Dataflow, Cloud Composer
  • Proficiency in Python for data processing
  • Advanced knowledge of SQL (joins, window functions, performance tuning)
  • Experience in ETL/ELT pipeline development and migration to cloud
  • Understanding of data warehousing and data modeling concepts
  • Experience working with large-scale distributed data systems
Nice-to-have skills

  • Knowledge of PySpark/Dataproc
  • Knowledge of Linux scripting
  • Familiarity with GitHub, Jenkins, Jira, etc.

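The "window functions" requirement above can be made concrete with a running total, a common BigQuery interview topic. The sketch below mirrors the standard SQL `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern in plain Python; the table and column names in the comment are illustrative, not from the posting:

```python
from itertools import groupby
from operator import itemgetter

# Python equivalent of the window-function query:
#   SELECT user_id, ts, amount,
#          SUM(amount) OVER (PARTITION BY user_id ORDER BY ts) AS running_total
#   FROM orders
def running_totals(rows):
    out = []
    ordered = sorted(rows, key=itemgetter("user_id", "ts"))
    for _, group in groupby(ordered, key=itemgetter("user_id")):
        total = 0  # reset per partition (PARTITION BY user_id)
        for r in group:
            total += r["amount"]  # cumulative sum in ts order (ORDER BY ts)
            out.append({**r, "running_total": total})
    return out

orders = [
    {"user_id": 1, "ts": 1, "amount": 10},
    {"user_id": 1, "ts": 2, "amount": 5},
    {"user_id": 2, "ts": 1, "amount": 7},
]
print(running_totals(orders))
```

The key detail, in SQL or Python, is that the accumulator resets at each partition boundary while rows within a partition are processed in order.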
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
GCP, Dataflow, BigQuery, Python, SQL, ETL, ELT, PySpark, Dataproc, Linux scripting
Certifications
B.Tech, M.Tech, MCA