Design, build, and implement applied AI systems that improve decision-making and unlock efficiencies
Collaborate with software engineers and data scientists on problems in inference optimization, scalable systems design, and human-computer interaction
Identify the most impactful problems to solve and translate commercial objectives into technical solutions
Combine and transform large multimodal datasets to run in-depth statistical analysis
Apply deep learning models to automate large-scale decisions integrated into live business processes
Deliver mission-critical code and solutions that generate tangible economic impact
Work in a client-facing setting to integrate models and solutions into customer workflows
Requirements
We are looking for team players with excellent quantitative abilities
A degree in computer science, mathematics, statistics, physics, economics, or similar fields from a leading university
In-depth understanding of mathematical and statistical concepts behind the most common machine learning techniques
Familiarity with frontier-lab APIs (Gemini, OpenAI, Anthropic) and their advanced features, such as structured outputs and tool calling (first sketch after this list)
Understanding of LLM inference considerations, e.g. input vs. output token costs and prompt caching (second sketch after this list)
Experience working with deep learning models in natural language processing or computer vision, e.g. Hugging Face model import, fine-tuning, and inference (third sketch after this list)
Experience writing code in Python
Experience and willingness to work through the challenges associated with large, unstructured, real-world datasets in a client-facing setting
Ability to spot solutions and opportunities that others miss
A high level of commitment and reliability, and a good balance of pragmatism and perfectionism
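First sketch: to illustrate the kind of API familiarity we mean, here is a minimal tool-calling request using the OpenAI Python SDK. This is a hedged sketch, not a reference implementation: the model name and the get_order_status tool are hypothetical placeholders, and the same request shape extends to structured outputs via the response_format parameter.

```python
# Minimal sketch of tool calling with the OpenAI Python SDK.
# Assumptions: OPENAI_API_KEY is set in the environment; the model
# name and the get_order_status tool are illustrative placeholders.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",  # hypothetical business tool
        "description": "Look up the status of a customer order.",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Where is order 12345?"}],
    tools=tools,
)

# If the model chose to call the tool, parse its name and arguments.
call = response.choices[0].message.tool_calls[0]
args = json.loads(call.function.arguments)
print(call.function.name, args)
```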
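Second sketch: one minimal illustration of the inference considerations above, using the Anthropic Python SDK. A long, static system prompt is marked for caching so that repeated calls sharing that prefix are billed at the cheaper cache-read rate rather than the full input-token rate. The policy document and model name are placeholders, not a prescription.

```python
# Minimal sketch of prompt caching with the Anthropic Python SDK.
# Assumptions: ANTHROPIC_API_KEY is set; the policy document and
# the model name are illustrative placeholders.
import anthropic

client = anthropic.Anthropic()
long_policy_doc = "..."  # large static context, reused across calls

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=256,
    system=[{
        "type": "text",
        "text": long_policy_doc,
        # Mark the static prefix as cacheable: subsequent requests that
        # share this prefix read it from cache instead of reprocessing it.
        "cache_control": {"type": "ephemeral"},
    }],
    messages=[{"role": "user", "content": "Summarise the refund policy."}],
)

# Input and output tokens are billed at different rates; the cache
# fields show how much of the prompt was written to or read from cache.
u = response.usage
print(u.input_tokens, u.output_tokens,
      u.cache_creation_input_tokens, u.cache_read_input_tokens)
```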
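Third sketch: a minimal example of Hugging Face model import and inference via the transformers pipeline. The checkpoint shown is one public example among many and stands in for whatever model the task actually calls for.

```python
# Minimal sketch of Hugging Face model import and inference.
# Assumptions: the transformers library is installed; the checkpoint
# is one public example, not a recommendation.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("The delivery arrived two days late."))
# -> [{'label': 'NEGATIVE', 'score': 0.99...}]
```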