DysrupIT

GCP Data Engineer

Full-time
Location Type: Remote

Location: Philippines

About the role

  • Design, develop, and maintain scalable data pipelines and ETL/ELT workflows on GCP.
  • Build and optimize data architectures to support large-scale data processing and analytics.
  • Implement and manage data solutions using Google Cloud data services.
  • Ensure data quality, integrity, and reliability across data pipelines and storage systems.
  • Work closely with cross-functional teams to understand data requirements and translate them into technical solutions.
  • Optimize performance of data workflows and improve data processing efficiency.
  • Implement best practices for data governance, security, and monitoring.
  • Support troubleshooting and resolution of production data issues.
  • Mentor junior data engineers and contribute to technical best practices.

Requirements

  • 5+ years of experience in Data Engineering or similar roles.
  • Strong hands-on experience with Google Cloud Platform (GCP).
  • Experience building data pipelines using tools such as Dataflow, BigQuery, Pub/Sub, Cloud Composer, or Dataproc.
  • Strong SQL and experience working with large datasets.
  • Experience with Python, Java, or Scala for data processing.
  • Experience with ETL/ELT frameworks and modern data architectures.
  • Knowledge of data modeling and data warehousing concepts.
  • Experience working in Agile development environments.

Applicant Tracking System Keywords
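To make the ETL/ELT experience asked for above concrete, here is a minimal, purely illustrative sketch of the extract-transform-load pattern in Python, using only the standard library. The sample data, the `events` table name, and the non-negative-amount filter are assumptions for the example, not details from the role:

```python
import csv
import io
import sqlite3

# Extract: read raw records from a CSV source (an in-memory sample here;
# in a real GCP pipeline this might be Cloud Storage or Pub/Sub).
raw = io.StringIO("user_id,amount\n1,10.5\n2,-3.0\n3,7.25\n")
rows = list(csv.DictReader(raw))

# Transform: drop invalid (negative) amounts and cast string fields to types.
clean = [(int(r["user_id"]), float(r["amount"]))
         for r in rows if float(r["amount"]) >= 0]

# Load: write the cleaned rows into a warehouse-style table
# (sqlite3 stands in for a destination such as BigQuery).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
print(total)  # 17.75
```

In production these three stages would typically be expressed with managed GCP services (e.g. Dataflow for transform, BigQuery for load), but the shape of the workflow is the same.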

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, ETL, ELT, data pipelines, SQL, Python, Java, Scala, data modeling, data warehousing
Soft Skills
mentoring, collaboration, troubleshooting, problem-solving, communication