Addepto

Data Engineer, GCP, Snowflake

Full-time

Location Type: Office

Location: Poland

About the role

  • Design, develop, test, and maintain data pipelines and ETL/ELT processes using GCP and Snowflake.
  • Implement data ingestion, transformation, and storage solutions for structured, semi-structured, and unstructured data.
  • Build and optimize batch, micro-batch, and real-time data pipelines.
  • Support data migration from legacy systems to cloud platforms (GCP, Snowflake).
  • Collaborate with business and technical stakeholders to translate requirements into scalable data solutions.
  • Work with GCP services such as BigQuery, Cloud SQL, Cloud Spanner, and Cloud Bigtable.
  • Integrate data from various sources and support data platform development.
  • Ensure data quality by implementing validation rules, testing frameworks, and monitoring solutions.
  • Work closely with security teams to ensure data protection, access control, and compliance.
  • Support development of data models and schemas for analytics and reporting.
  • Contribute to CI/CD processes, version control, and infrastructure automation (e.g., Git, Terraform).
  • Collaborate with data scientists, analysts, and engineers to support data-driven use cases.

Requirements

  • 5+ years of experience in Data Engineering or a similar role.
  • 2+ years of experience working with GCP or similar cloud platforms (AWS, Azure).
  • Hands-on experience with GCP managed data services (e.g., BigQuery, Cloud SQL, Cloud Spanner, Cloud Bigtable).
  • Experience working with Snowflake.
  • Strong knowledge of SQL and experience with data transformation tools (e.g., dbt or similar).
  • Proficiency in Python for data processing and scripting.
  • Experience with ETL/ELT processes and data pipeline development.
  • Experience working with structured, semi-structured, and unstructured data.
  • Familiarity with data orchestration tools (e.g., Airflow, Dagster, or similar).
  • Experience with version control (Git) and CI/CD practices.
  • Experience with Infrastructure as Code (e.g., Terraform, Ansible, or similar).
  • Strong analytical, problem-solving, and troubleshooting skills.
  • Excellent written and verbal communication skills in English.
  • Experience in a client-facing or consulting environment is a plus.

Benefits

  • Work in a supportive team of passionate enthusiasts of AI & Big Data.
  • Engage with top-tier global enterprises and cutting-edge startups on international projects.
  • Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
  • Accelerate your professional growth through defined career paths, knowledge-sharing initiatives, language classes, and sponsored training and conferences, including a Databricks partnership that provides industry-leading training materials and certifications.
  • Choose your preferred form of cooperation: B2B or a contract of mandate, and make use of 20 fully paid days off.
  • Participate in team-building events and utilize the integration budget.
  • Celebrate work anniversaries, birthdays, and milestones.
  • Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
  • Get full work equipment for optimal productivity, including a laptop and other necessary devices.
  • Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, ETL, ELT, data pipelines, SQL, Python, data transformation, data modeling, data quality, data orchestration
Soft Skills
analytical skills, problem-solving, troubleshooting, communication skills, collaboration, stakeholder engagement, client-facing skills