Addepto

Data Architect, GCP, Snowflake

Full-time
Location Type: Hybrid

Location: Poland

About the role

  • Lead data architecture and engineering projects within GCP and Snowflake environments, including migration, transformation, and platform design.
  • Design end-to-end data architectures (ingestion, storage, processing, and serving layers).
  • Define and guide implementation of data integration, ETL/ELT strategies, and pipeline architectures.
  • Develop logical and physical data models, schemas, and data structures across business domains.
  • Guide clients on data strategy, architecture decisions, and best practices across GCP.
  • Design and enforce data governance, security, and access control frameworks.
  • Work closely with security teams to ensure data protection and compliance.
  • Lead client workshops to identify data sources, flows, and requirements.
  • Define future-state architectures, roadmaps, and implementation plans.
  • Collaborate with cross-functional teams (data scientists, engineers, business stakeholders) to deliver data solutions.
  • Evaluate and select tools, frameworks, and architectural patterns.
  • Provide technical leadership and mentorship to engineering teams.
  • Oversee performance optimization, scalability, and cost efficiency of data platforms.

Requirements

  • 7+ years of experience in Data Engineering, Data Architecture, or Data Infrastructure roles.
  • 3+ years of experience working with GCP or similar cloud platforms (AWS, Azure).
  • Proven experience in designing data architectures and large-scale data platforms.
  • Strong experience with GCP managed services (BigQuery, Cloud SQL, Cloud Spanner, Cloud Bigtable).
  • Hands-on experience with Snowflake, including data modeling and performance optimization.
  • Strong expertise in data modeling, database design, and data architecture patterns.
  • Experience with ETL/ELT design, data integration, and migration from legacy systems.
  • Experience working with structured, semi-structured, and unstructured data.
  • Understanding of data governance, security, and compliance frameworks.
  • Experience with Infrastructure as Code (Terraform, Ansible, or similar).
  • Proficiency in SQL and good understanding of Python.
  • Strong analytical, problem-solving, and communication skills.
  • Experience working in a client-facing or consulting environment.

Benefits

  • Work in a supportive team of passionate enthusiasts of AI & Big Data.
  • Engage with top-tier global enterprises and cutting-edge startups on international projects.
  • Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
  • Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
  • Choose your preferred form of cooperation: B2B or a contract of mandate, and make use of 20 fully paid days off.
  • Participate in team-building events and utilize the integration budget.
  • Celebrate work anniversaries, birthdays, and milestones.
  • Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
  • Get full work equipment for optimal productivity, including a laptop and other necessary devices.
  • Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data architecture, data engineering, ETL, ELT, data modeling, database design, data integration, Infrastructure as Code, SQL, Python
Soft Skills
analytical skills, problem-solving skills, communication skills, leadership, mentorship