TechBiz Global

Data Engineer

full-time

Location Type: Remote

Location: Remote • 🇮🇳 India

Job Level

Mid-Level, Senior

Tech Stack

Apache, AWS, BigQuery, Cloud, ETL, Google Cloud Platform, Kafka, MySQL, Postgres, Python, RDBMS

About the role

  • Design, develop, and maintain data ingestion pipelines using Kafka Connect and Debezium for real-time and batch data integration.
  • Ingest data from MySQL and PostgreSQL databases into AWS S3, Google Cloud Storage (GCS), and BigQuery.
  • Implement best practices for data modeling, schema evolution, and efficient partitioning in the Bronze Layer.
  • Ensure reliability, scalability, and monitoring of Kafka Connect clusters and connectors.
  • Collaborate with cross-functional teams to understand source systems and downstream data requirements.
  • Optimize data ingestion processes for performance and cost efficiency.
  • Contribute to automation and deployment scripts using Python and cloud-native tools.
  • Stay updated with emerging data lake technologies such as Apache Hudi or Apache Iceberg.
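In practice, the ingestion work described above often starts by registering a Debezium source connector with a Kafka Connect cluster. A minimal sketch of such a connector definition (the connector name, hostnames, credentials, and database/table names below are hypothetical placeholders, not details from this role):

```python
import json

# Hypothetical Debezium MySQL CDC source connector config.
# Hostnames, credentials, and database/table names are placeholders.
connector = {
    "name": "mysql-orders-cdc",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "mysql.internal",
        "database.port": "3306",
        "database.user": "cdc_user",
        "database.password": "********",
        "database.server.id": "5400",
        "topic.prefix": "bronze",
        "database.include.list": "shop",
        "table.include.list": "shop.orders",
        "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
        "schema.history.internal.kafka.topic": "schema-history.shop",
    },
}

payload = json.dumps(connector, indent=2)
print(payload)
# To register the connector, POST this JSON to the Kafka Connect
# REST API, e.g. http://connect:8083/connectors
```

From there, sink connectors (or downstream jobs) move the change events into S3, GCS, or BigQuery.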

Requirements

  • 5+ years of hands-on experience as a Data Engineer or in a similar role.
  • Strong experience with Apache Kafka and Kafka Connect (sink and source connectors).
  • Experience with Debezium for change data capture (CDC) from RDBMS.
  • Proficiency in working with MySQL and PostgreSQL.
  • Hands-on experience with AWS S3, GCP BigQuery, and GCS.
  • Proficiency in Python for automation, data handling, and scripting.
  • Understanding of data lake architectures and ingestion patterns.
  • Solid understanding of ETL/ELT pipelines, data quality, and observability practices.
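The "efficient partitioning in the Bronze Layer" mentioned above typically means writing raw landed files under Hive-style, date-partitioned object keys so that downstream queries can prune partitions. A small sketch of one such key layout (the bucket prefix, source, and table names are illustrative, not a standard):

```python
from datetime import datetime, timezone

def bronze_key(source: str, table: str, ts: datetime, part: int) -> str:
    """Build a Hive-style date-partitioned object key for a Bronze
    Layer landing zone. The layout is one common convention, not a
    fixed standard."""
    return (
        f"bronze/{source}/{table}/"
        f"dt={ts:%Y-%m-%d}/hour={ts:%H}/"
        f"part-{part:05d}.parquet"
    )

ts = datetime(2024, 6, 1, 13, 30, tzinfo=timezone.utc)
key = bronze_key("mysql", "orders", ts, 3)
print(key)  # bronze/mysql/orders/dt=2024-06-01/hour=13/part-00003.parquet
```

The same key scheme works for S3 and GCS, and BigQuery external tables can exploit the `dt=` segments for partition pruning.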

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data ingestion pipelines, Kafka Connect, Debezium, MySQL, PostgreSQL, AWS S3, Google Cloud Storage, BigQuery, Python, ETL/ELT
Soft skills
collaboration, communication, problem-solving, adaptability, attention to detail