
Data Scientist
Nuvei
full-time
Location Type: Hybrid
Location: Tel Aviv-Yafo • 🇮🇱 Israel
Job Level
Mid-Level, Senior
Tech Stack
AWS, Azure, Cloud, Kafka, Microservices, Python, PyTorch, Scikit-Learn, Spark, SQL, TensorFlow
About the role
- Develop and implement advanced machine learning and generative AI models to solve complex business problems, enhance product offerings, and improve customer experiences.
- Conduct data exploration and feature engineering to uncover hidden insights and identify opportunities for leveraging data across the business.
- Collaborate with cross-functional teams to understand business needs and provide data-driven solutions and recommendations.
- Stay abreast of the latest developments in machine learning and Gen-AI technologies, applying cutting-edge techniques and tools to drive innovation within Nuvei.
- Evaluate model performance and continuously iterate on and refine models to maintain their accuracy and relevance.
- Communicate complex data concepts and the results of analyses clearly and effectively to stakeholders across the company.
- Utilize big data technologies to process and analyze large datasets efficiently, ensuring scalable solutions for data-driven insights.
- Design and implement big data machine learning solutions using technologies such as Spark and Kafka for real-time data processing and analytics.
- Explore and integrate new big data technologies and tools into the existing data ecosystem to improve data analytics and model development processes.
- Design, build, and productionize agentic AI solutions (single- and multi-agent) for payment use cases such as risk & fraud detection, dispute/chargeback workflows, merchant support, monitoring, and data quality triage.
- Implement and maintain MCP servers/clients to safely expose tools, data sources, and actions to models; define resources, prompts, schemas, and tool contracts aligned with Nuvei security, compliance, and observability standards.
- Establish evaluation and LLM observability for agents (task success, reliability, latency, cost, safety), with structured telemetry, tracing, and automated regressions.
- Implement guardrails and human-in-the-loop mechanisms (policy checks, approvals, safe tool sandboxes, rate limiting).
- Write clear documentation and runbooks for agent behaviors, MCP endpoints, versioning, and incident response.
Requirements
- A Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field with a strong focus on machine learning or artificial intelligence.
- At least 3 years of hands-on experience in developing and implementing machine learning models, with a strong preference for candidates with experience in generative AI.
- Proficiency in programming languages used in data science, such as Python, and familiarity with machine learning libraries (e.g., scikit-learn, TensorFlow, PyTorch).
- Experience with data manipulation and analysis tools, including SQL, and with working on large datasets.
- Strong understanding of machine learning principles and algorithms, including supervised and unsupervised learning. Demonstrated ability to apply machine learning techniques to real-world problems, from initial concept through data preparation, model development, and deployment.
- Excellent analytical and problem-solving abilities, with a keen attention to detail and a data-driven approach to decision-making.
- Exceptional communication skills, with the ability to convey complex data insights to both technical and non-technical stakeholders.
- Familiarity with cloud computing services (AWS, Azure, or Google Cloud) and experience deploying models in a cloud environment is a plus.
- A track record of continuous learning and adapting to new technologies and techniques in data science and AI.
- Demonstrated experience designing and operating agentic AI systems in production, including planning, tool calling, memory management, and failure recovery.
- Hands-on experience with MCP concepts (servers, tools/resources, prompts, sessions) or similar tool-exposure protocols; ability to define JSON-schema contracts and enforce authorization, auditing, and quota controls.
- Practical knowledge of orchestration frameworks (e.g., graph-based or multi-agent frameworks) and integrating agents with microservices, webhooks, queues, and event streams.
- Experience with RAG, vector stores, and retrieval strategies optimized for tool-using agents; familiarity with evaluation frameworks and LLM telemetry/tracing.
- Understanding of security and governance for AI systems (PII handling, secrets management, RBAC/ABAC, row-level security, data residency).
Benefits
- Private Medical Insurance
- Office and home hybrid working
- Global bonus plan
- Volunteering programs
- Prime location office close to Tel Aviv train station
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
machine learning, generative AI, data exploration, feature engineering, big data technologies, Spark, Kafka, Python, SQL, MCP concepts
Soft skills
analytical abilities, problem-solving, attention to detail, data-driven decision-making, communication skills, collaboration, continuous learning, adaptability