Kyriba

Data Architect / Platform Specialist – Enterprise, Databricks

full-time

Location Type: Remote

Location: Remote • 🇵🇱 Poland


Job Level

Mid-Level · Senior

Tech Stack

AWS · Cloud · ETL

About the role

  • Design, implement, and evolve enterprise data architectures spanning multiple business domains and use cases.
  • Define and enforce architectural standards and best practices for data modeling, integration, and governance.
  • Ensure data solutions are scalable, secure, and optimized for reporting, BI, advanced analytics, ML, and GenAI workloads.
  • Lead Databricks platform implementation and apply Databricks data design patterns, including Delta Lake architecture and unified analytics.
  • Architect Databricks environments to support batch, streaming, real-time, and advanced analytics; integrate with AWS S3 and enterprise platforms.
  • Act as the primary interface between data, IT, business, and analytics teams; drive data standardization across the finance, operations, HR, supply chain, and customer domains.
  • Architect and optimize data flows for operational and analytical reporting, BI dashboards (e.g., QlikView), and self-service analytics.
  • Partner with Data Scientists and ML Engineers to ensure ML/GenAI readiness (feature stores, model training, scalable inference).
  • Implement enterprise data governance, data quality, security, and compliance frameworks; oversee metadata management, lineage, and cataloging.
  • Evaluate and adopt emerging technologies; foster continuous improvement and best practices in data architecture and platform engineering.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.
  • Extensive experience as a Data Architect or Platform Specialist supporting multiple business domains across large organizations.
  • Proven expertise in designing and implementing data architectures on Databricks and AWS S3.
  • Deep knowledge of data modeling, data warehousing, ETL/ELT, and cloud data platforms.
  • Experience with Databricks best practices for reporting, BI, ML, and GenAI.
  • Strong understanding of BI tools (e.g., QlikView) and their integration with enterprise data platforms.
  • Familiarity with ML/GenAI architectures, workflows, and operationalization.
  • Comprehensive knowledge of data governance, security, and compliance frameworks.
  • Outstanding communication, leadership, and stakeholder management skills.
  • Nice to have: Certifications in Databricks, AWS, or enterprise architecture frameworks (e.g., TOGAF).
  • Nice to have: Experience with data mesh, data fabric, or modern data stack concepts.
  • Nice to have: Exposure to automation and integration platforms (e.g., MuleSoft).

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data architecture · data modeling · ETL · ELT · data warehousing · Databricks · AWS S3 · BI · ML · GenAI
Soft skills
communication · leadership · stakeholder management
Certifications
Databricks certification · AWS certification · TOGAF certification