PRODYNA

AI Engineer

Full-time

Location Type: Hybrid

Location: Athens, Greece


About the role

  • At PRODYNA, we design, implement, and operate custom software solutions for mid- to large-sized enterprises. We help our clients unlock the full potential of their data through modern cloud platforms, scalable architectures, and cutting-edge technologies. We foster a culture of collaboration, continuous learning, and innovation—where engineers can shape impactful solutions and grow their careers.
  • **Your Role:** As an AI Engineer with **Fabric & AI expertise**, you will design and build modern data platforms and intelligent data solutions that power analytics and AI-driven decision-making. You’ll work closely with Data Architects, ML Engineers, and business stakeholders to deliver scalable, high-performance data ecosystems.
  • Design and implement **modern data platforms** for large-scale data integration and processing
  • Build and optimize **data pipelines (ETL/ELT)** across cloud environments
  • Develop solutions using **Microsoft Fabric (Lakehouse, Data Factory, Synapse, Power BI)**
  • Enable **AI/ML use cases** by preparing high-quality, feature-ready datasets
  • Integrate data from multiple sources (batch, streaming, APIs, on-prem & cloud)
  • Ensure **data quality, governance, and reliability** across the platform
  • Collaborate with stakeholders to define requirements and deliver data-driven solutions
  • Contribute to **data architecture design** (Data Mesh, Lakehouse, etc.)
  • Support deployment and automation using **CI/CD and DevOps practices**

Requirements

  • 3+ years of experience in **Data Engineering or Data Platform development**
  • Strong SQL and programming skills (**Python, Scala, or Java**)
  • Experience with **cloud platforms** (Azure preferred; AWS or GCP also welcome)
  • Hands-on experience with **Microsoft Fabric** or related Azure data services
  • Solid understanding of:
    - Data modeling (Kimball, Data Vault, etc.)
    - Data warehousing & lakehouse architectures
    - Distributed data processing (e.g., Spark)
  • Familiarity with **AI/ML workflows** (feature engineering, data prep, MLOps)
  • Experience with tools like **Databricks, dbt, Kafka, or Snowflake** is a plus
  • Strong analytical mindset and problem-solving skills
  • Fluent in English
  • Entitled to work in the EU
Benefits
  • **Compensation & Perks**
  • Salary: the exact compensation is agreed based on prior experience and skills
  • Private health insurance & life insurance from day #1
  • Health management scheme (weekly sessions & monthly challenges)
  • 25 vacation days
  • Tech-oriented team events and more
  • International network
  • Lunch in the office
  • Employee referral programme & bonus
  • **Dedicated budget for:**
    - Employee education: ~€800
    - Hardware selection (MacBook or Lenovo ThinkPad) with your own mobile: ~€3,000
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Data Engineering, Data Platform development, SQL, Python, Scala, Java, Data modeling, Data warehousing, Distributed data processing, AI/ML workflows
Soft Skills
Analytical mindset, problem-solving skills, collaboration, continuous learning, innovation, communication