Tech Stack
Airflow, AWS, Azure, Cloud, Docker, Google Cloud Platform, Kubernetes, Pandas, Python, React, SQL
About the role
- Design, build, and optimize data pipelines and AI/ML infrastructure to support automation and AI projects across multiple domains
- Develop and deploy LLM-driven applications and automation solutions, focusing on process automation, workflow orchestration, and enterprise integrations
- Support data preparation, curation, and fine-tuning for large language models and ML workflows
- Ensure reliability, scalability, and efficiency of AI applications within TeamViewer’s ecosystem
- Collaborate with both internal and external stakeholders to identify automation opportunities and consult on AI/ML technologies
- Participate in technology evaluations and feasibility studies for new AI/ML tools and frameworks
- Promote best practices in data engineering, machine learning, and responsible AI usage
- Be part of a newly established AI team that consults on, builds, and deploys automation and AI solutions for internal and external projects
- Collaborate with other teams experimenting with LLMs and ML to align on technologies and best practices
- Report to the AI Team Lead and create measurable business impact in an agile, modern environment
Requirements
- University degree in computer science, data engineering, machine learning, or a related field
- 3+ years of professional experience in data engineering or ML engineering
- Strong experience with Python (pandas, SQLAlchemy, FastAPI, etc.) and data frameworks
- Knowledge of vector databases (e.g., Pinecone, Weaviate, Milvus, pgvector)
- Practical experience with LLMs (fine-tuning, RAG, prompt engineering, evaluation); see the retrieval sketch after this list
- Understanding of LangChain / LangGraph (or willingness to learn)
- Proficiency in SQL and experience working with relational databases and data warehouses
- Familiarity with MCP (Model Context Protocol) and orchestration frameworks
- Experience with machine learning pipelines and model lifecycle management
- Familiarity with enterprise automation frameworks (e.g., Airflow, Prefect, Dagster) and integration platforms; see the pipeline sketch after this list
- Fluency in English is mandatory; German is a plus
- Nice to Have: Hands-on experience with cloud platforms (AWS, GCP, Azure) and containerization (Docker, Kubernetes)
- Nice to Have: Background in MLOps tools (MLflow, Weights & Biases, Kubeflow)
- Nice to Have: Knowledge of data governance, security, and compliance for AI solutions
- Nice to Have: Experience in front-end integration of AI tools (e.g., Streamlit, Gradio, or React for prototypes)
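To illustrate the retrieval side of the RAG experience listed above, here is a minimal sketch, assuming sentence-transformers and numpy are installed; the embedding model, sample documents, and in-memory similarity search are illustrative stand-ins for a proper vector database such as pgvector.

```python
# Minimal retrieval sketch; model name, documents, and in-memory search are
# illustrative only. A production setup would store embeddings in a vector
# database (e.g., pgvector, Weaviate, Milvus) rather than in memory.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "TeamViewer enables remote access and support.",
    "Airflow schedules and monitors data pipelines.",
    "pgvector adds vector similarity search to PostgreSQL.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedder
doc_vectors = model.encode(documents, normalize_embeddings=True)


def retrieve(query: str, top_k: int = 2) -> list[str]:
    """Return the top_k documents most similar to the query (cosine similarity)."""
    query_vector = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vectors @ query_vector  # normalized dot product = cosine similarity
    best = np.argsort(scores)[::-1][:top_k]
    return [documents[i] for i in best]


if __name__ == "__main__":
    # The retrieved snippets would normally be passed to an LLM as grounding context.
    print(retrieve("How do I schedule a pipeline?"))
```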
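And to illustrate the orchestration side, a minimal Airflow DAG sketch, assuming Apache Airflow 2.4+ and pandas; the DAG id, task names, staging path, and sample data are hypothetical placeholders.

```python
# Minimal DAG sketch, assuming Apache Airflow 2.4+ and pandas; the DAG id,
# staging path, and sample data are hypothetical placeholders.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

SNAPSHOT_PATH = "/tmp/usage_snapshot.csv"  # placeholder staging location


def extract_usage_data():
    # Placeholder extraction step: a real pipeline would pull from a source
    # system (API, warehouse table, object storage) instead of inlining data.
    df = pd.DataFrame({"user_id": [1, 2, 3], "sessions": [5, 3, 8]})
    df.to_csv(SNAPSHOT_PATH, index=False)


def load_to_warehouse():
    # Placeholder load step: log the row count instead of writing to a warehouse.
    df = pd.read_csv(SNAPSHOT_PATH)
    print(f"Would load {len(df)} rows into the warehouse")


with DAG(
    dag_id="usage_snapshot_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_usage_data)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    extract >> load  # run extraction first, then the load step
```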