12Go

Data Engineer

Contract

Location: 🇦🇿 Azerbaijan

Job Level

Mid-Level, Senior

Tech Stack

Airflow, AWS, Docker, ETL, MariaDB, Python, SQL

About the role

  • Maintain and stabilize existing ETL pipelines;
  • Deploy and migrate workflows to Airflow;
  • Introduce and support basic Data Quality checks (freshness, completeness, uniqueness; see the illustrative sketch after this list);
  • Design and implement pipelines integrating data from APIs, advertising platforms, and internal systems;
  • Participate in migration from Slowly Changing Dimensions (SCD) to Change Data Capture (CDC) approaches;
  • Handle incoming requests from other teams related to the data warehouse (e.g., creating new tables, configuring imports);
  • Connect and integrate new external and internal data sources into the platform;
  • Write clear and maintainable documentation for implemented solutions;
  • Collaborate with cross-functional teams to ensure data availability and reliability;
  • Work within the R&D team to build and improve 12Go's online travel booking platform.
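
For orientation only, here is a minimal sketch of what an Airflow pipeline with the basic data-quality checks named above (freshness, completeness, uniqueness) can look like. It is not 12Go's actual code: it assumes Airflow 2.4+ with the TaskFlow API, and every name in it (bookings_etl, booking_id, updated_at, the stubbed extract and load steps) is invented for illustration.

    # Illustrative sketch only: a small Airflow DAG (TaskFlow style, Airflow 2.4+)
    # that runs freshness, completeness, and uniqueness checks between a stubbed
    # extract step and a stubbed load step. All table and field names are invented.
    from __future__ import annotations

    from datetime import datetime, timedelta, timezone

    from airflow.decorators import dag, task


    @dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
    def bookings_etl():
        @task
        def extract() -> list[dict]:
            # Stub for pulling rows from an API or MariaDB source; a real
            # implementation would call the existing ETL extraction code.
            now = datetime.now(timezone.utc).isoformat()
            return [
                {"booking_id": 1, "updated_at": now},
                {"booking_id": 2, "updated_at": now},
            ]

        @task
        def quality_check(rows: list[dict]) -> list[dict]:
            # Completeness: the batch must not be empty.
            if not rows:
                raise ValueError("completeness check failed: empty batch")
            # Uniqueness: booking_id must not repeat within the batch.
            ids = [r["booking_id"] for r in rows]
            if len(ids) != len(set(ids)):
                raise ValueError("uniqueness check failed: duplicate booking_id")
            # Freshness: the newest record must be less than 2 hours old.
            newest = max(datetime.fromisoformat(r["updated_at"]) for r in rows)
            if datetime.now(timezone.utc) - newest > timedelta(hours=2):
                raise ValueError("freshness check failed: data older than 2 hours")
            return rows

        @task
        def load(rows: list[dict]) -> None:
            # Stub for writing validated rows into the warehouse.
            print(f"loading {len(rows)} rows")

        load(quality_check(extract()))


    bookings_etl()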

Requirements

  • 3+ years of experience as a Data Engineer;
  • Experience in custom ETL design, implementation, and maintenance;
  • Hands-on experience with Airflow and ClickHouse;
  • Good SQL and Python skills;
  • Proactive, initiative-driven mindset;
  • Understanding of AWS concepts (nice to have);
  • Experience with relational databases such as MariaDB (nice to have);
  • Familiarity with dbt and Docker (nice to have).