Tech Stack
AWS, Azure, Cloud, Google Cloud Platform, PySpark, Python, Scala, Spark, SQL, Unity Catalog
About the role
- Advise clients during the conception, planning, and implementation of Databricks-based data & AI solutions
- Analyze requirements and design tailored Databricks solutions
- Build and optimize data pipelines using Spark, Delta Lake, and Databricks-native tooling (see the pipeline sketch after this list)
- Contribute to project areas including cloud engineering, infrastructure, advanced analytics, and generative AI applications
- Present results and insights to stakeholders and management
- Occasionally travel to Switzerland and internationally to meet clients and collaborate with headquarters in Zurich
- Work from the consulting hub in Athens with project teams to deliver high-quality, reliable solutions
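To illustrate the kind of pipeline work the role involves, here is a minimal PySpark + Delta Lake sketch. It is illustrative only: the table names, landing path, and column names are hypothetical placeholders, not part of the role description.

```python
# Minimal bronze-to-silver pipeline sketch (illustrative; names and paths are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-bronze-to-silver").getOrCreate()

# Ingest raw JSON events from a hypothetical landing zone into a bronze Delta table.
bronze_df = spark.read.json("/mnt/landing/orders/")
bronze_df.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Clean, type, and deduplicate into a silver Delta table.
silver_df = (
    spark.table("bronze.orders")
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
)

(
    silver_df.write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("silver.orders")
)
```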
Requirements
- University degree in Computer Science or a related field
- Strong knowledge of Databricks and the Spark ecosystem
- Hands-on experience with Delta Lake, Databricks SQL, MLflow, Unity Catalog, and Databricks Workflows (see the MLflow sketch after this list)
- Experience working with large-scale data sets, streaming data, and databases
- Proficiency in relevant programming languages (e.g., SQL, Python, PySpark, Scala)
- Experience with at least one major cloud platform: Azure, AWS, or GCP
- Down-to-earth, pragmatic, and results-oriented attitude
- Excellent communication skills in English
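As a small example of the hands-on MLflow experience listed above, here is a minimal experiment-tracking sketch. The model, dataset, parameters, and run name are placeholder assumptions used purely for illustration.

```python
# Minimal MLflow tracking sketch (illustrative; model and data are placeholders).
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for a real feature table.
X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="baseline-logreg"):
    model = LogisticRegression(max_iter=200).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    # Log the parameter, metric, and model artifact to the tracking server.
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")
```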