GR8 Tech

Middle Data Engineer

Full-time

Location Type: Remote

Location: United States

About the role

  • Working with large datasets (100+ TB) with updates at least hourly;
  • Developing and supporting ETL/ELT processes across multiple data sources;
  • Building and maintaining data warehouses and data marts on AWS (S3, Athena, Redshift), GCP (Cloud Storage, BigQuery), and PostgreSQL;
  • Designing and implementing RESTful APIs (Aiohttp, Flask, FastAPI) for internal and external data consumption;
  • Collecting and processing data from Kafka, Google Analytics, Firebase, Appsflyer, Cloudflare, and other third-party applications;
  • Designing and maintaining a centralized data catalog with well-validated and documented data models;
  • Automating data quality and integrity tests to ensure high data reliability;
  • Developing and integrating semantic layers to standardize data access across teams;
  • Creating and maintaining comprehensive project documentation;
  • Collaborating with cross-functional teams to understand business requirements and translate them into scalable data solutions;
  • Driving continuous improvement in data engineering processes, tools, and best practices;
  • Monitoring data pipelines, resolving incidents, and optimizing performance.

Requirements

  • 2+ years of experience in Python / Data Engineering;
  • Hands-on experience with ETL, Data Warehousing, and relational databases (PostgreSQL, Microsoft SQL Server, etc.);
  • Experience with job scheduling and task queues;
  • Familiarity with cloud providers: AWS (S3, Athena, Redshift), GCP (Cloud Storage, BigQuery), or similar;
  • Proficiency in Linux and containerization (Docker);
  • Experience with BDD, TDD, or unit testing frameworks;
  • Extensive knowledge of software design best practices and design patterns;
  • Solid Computer Science fundamentals and database theory (database types and their trade-offs);
  • Experience in performance tuning of ETL jobs, SQL queries, partitioning, and indexing;
  • Hands-on experience with version control systems (Git) and CI/CD pipelines;
  • Familiarity with web/mobile application data sources is a plus;
  • Knowledge of or experience with Kubernetes, Apache Airflow, dbt, NoSQL databases (MongoDB, Elasticsearch, Redis), the Kafka ecosystem, IaC tools (Terraform, Ansible), Salesforce platforms, and graph databases (Neo4j, AgensGraph) is a plus;
  • Experience in near real-time data processing and data visualization tools (Tableau, PowerBI, Metabase, Grafana, Kibana, Apache Superset) is a plus;
  • Strong problem-solving, analytical, and data-driven decision-making skills;
  • Excellent communication skills, with the ability to collaborate across technical and non-technical teams;
  • Intermediate English or higher.

Benefits

  • An annual fixed budget that you can use based on your needs and lifestyle. You decide how to allocate it:
      • Sports – gym, yoga, or any activity to keep you active;
      • Medical – insurance and wellness services;
      • Mental health – therapy or coaching support;
      • Home office – ergonomic furniture, gadgets, and tools;
      • Languages – courses to improve or learn new skills.
  • Parental support with paid maternity/paternity leave and monthly childcare allowance;
  • 20+ vacation days, unlimited sick leave, and emergency time off;
  • Remote-first setup with full tech support and coworking compensation;
  • Regular team events – online, offline, and offsite;
  • Learning culture with internal courses, career development programs, and real growth opportunities.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Python, ETL, Data Warehousing, PostgreSQL, AWS, GCP, Linux, Docker, SQL, Data Engineering
Soft skills
problem-solving, analytical skills, data-driven decision-making, communication skills, collaboration