Tech Stack
Azure, Docker, Microservices, NoSQL, Pandas, Python, PyTorch, Scikit-Learn, SQL
About the role
- Collaborate within an agile team to build AI-driven solutions: work with development tickets, contribute to sprint planning and reviews, and ensure alignment with team goals and timelines.
- Drive AI research and experimentation: explore novel approaches, design and run experiments (including A/B tests), and translate findings into actionable improvements.
- Support end-to-end delivery processes: participate in release cycles, demos, and feedback loops to continuously improve AI features and user experience.
Requirements
- Strong foundation in OOP, modular programming, unit testing, code readability, and maintainability.
- Team collaboration: proficient at working within the Scrum framework.
- Strong experience with LangChain and Azure Cognitive Services (e.g., Azure OpenAI, AI Search, Document Intelligence).
- Thorough understanding of GenAI and LLM concepts: prompt engineering, LLM tools, RAG architecture and agentic design patterns.
- Strong experience building RESTful microservices with FastAPI, SQLModel, Pydantic, Pika, Celery, and PyMongo; working knowledge of SQL and NoSQL databases and message brokers.
- Strong experience with Python async programming and WebSocket-based communication.
- Experienced in mature development practices: Docker, Git workflows, release management, documentation, and linting.
- Familiarity with Azure-centric development: Container Apps, Azure Functions, Service Bus, Blob Storage, etc.
- Optional: Advanced GenAI skills (fine-tuning, Hugging Face models, Model Context Protocol).
- Optional: Traditional machine learning: Pandas, scikit-learn, PyTorch; solid grounding in statistics and ML fundamentals.
- Optional: Research background with algorithm performance metrics, A/B testing, and feasibility analysis.
- Optional: Experience with visualization and dashboarding tools: Streamlit, Plotly, Python Reflex, etc.