As a full-spectrum AWS integrator, we help hundreds of companies realize the value, efficiency, and productivity of the cloud. We take customers on their journey to enable, operate, and innovate using cloud technologies, from migration strategy to operational excellence and immersive transformation. Our Data Scientists are experienced technologists with technical depth and breadth, along with strong interpersonal skills. In this role, you will work directly with customers and our team to enable innovation through continuous, hands-on deployment across technology stacks. You will build data science pipelines that span model development, A/B testing, natural language processing, computer vision, MCP servers, and AI agents. You will deliver ML models and pipelines that solve real-world business problems, applying MLOps best practices to ensure successful deployment of ML models and application code. You will leverage cloud-based architectures and technologies to deliver optimized ML models at scale, using programming languages like Python and popular ML frameworks such as scikit-learn and TensorFlow.
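To give a concrete flavor of the pipeline work described above (not a prescribed stack), here is a minimal scikit-learn sketch; the dataset, column names, and model choice are illustrative placeholders.

```python
# Minimal sketch of a supervised ML pipeline, assuming a generic tabular dataset.
# "customers.csv" and its columns are hypothetical placeholders.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.read_csv("customers.csv")
X, y = df.drop(columns=["churned"]), df["churned"]

preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["tenure_months", "monthly_spend"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["plan", "region"]),
])

model = Pipeline([
    ("preprocess", preprocess),
    ("clf", GradientBoostingClassifier()),
])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```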
Job Responsibilities:
Design, develop, and maintain AI Agents leveraging modern frameworks and LLM-based orchestration techniques
Build, deploy, and optimize MCP (Model Context Protocol) servers to extend agent capabilities and enable integration with external tools and APIs (a minimal sketch follows this list)
Architect scalable, modular, and maintainable systems for multi-agent collaboration and tool interoperability
Implement secure and efficient communication channels between AI agents, MCP servers, and client applications
Continuously evaluate new frameworks, protocols, and libraries in the AI agent ecosystem (e.g., LangChain, AutoGen, crewAI, or custom MCP implementations)
Build complex queries against relational and NoSQL databases such as Oracle, SQL Server, MariaDB, MySQL, and MongoDB
Work with structured and unstructured data sets
Apply supervised and unsupervised machine learning techniques
Use modern ML frameworks such as scikit-learn, PyTorch, or TensorFlow
Perform data labelling, categorization, and structuring
Break down and clearly define problems
Communicate highly technical results to a diverse audience
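As a rough illustration of the MCP server responsibility above, the sketch below assumes the official `mcp` Python SDK and its `FastMCP` helper; the tool name and logic are placeholders, and exact APIs may differ between SDK versions.

```python
# Minimal MCP server sketch, assuming the `mcp` Python SDK's FastMCP helper.
# The "get_weather" tool is a hypothetical stand-in for a real integration.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a (stubbed) weather summary for a city."""
    # A real server would call an external weather API here; hard-coded for brevity.
    return f"Weather in {city}: 18 degrees C, partly cloudy"

if __name__ == "__main__":
    # Runs the server so an MCP-capable agent or client can discover and call the tool.
    mcp.run()
```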
Requirements:
5+ years of hands-on experience in data mining and predictive analytics with statistical modelling techniques
5+ years of experience in networking, infrastructure, or database architectures
Experience with Python and modern ML frameworks such as scikit-learn, PyTorch, or TensorFlow
Proven experience in designing and deploying MCP servers or similar protocol-based integrations
Strong software engineering background (Python, Node.js, or similar languages)
Experience with AI agent frameworks (e.g., LangChain, crewAI, AutoGen, Semantic Kernel, Haystack)
Solid understanding of LLM orchestration, context management, and retrieval-augmented generation (RAG) (see the toy sketch after this list)
Knowledge of API design, microservices, and communication protocols (WebSockets, gRPC, REST)
Familiarity with containerization and deployment (Docker, Kubernetes, or serverless)
Background in cloud platforms (AWS, Azure, GCP) and MLOps workflows
Understanding of security best practices for agent–server interactions (auth, sandboxing, data governance)
MS or PhD in Applied Mathematics, Physics, Computer Science, Statistics, or related technical field
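As a toy illustration of the RAG pattern referenced in the requirements, the sketch below retrieves context with TF-IDF similarity and assembles a prompt; a production system would use embeddings, a vector store, and an actual LLM call, all of which are omitted here.

```python
# Toy RAG sketch: retrieve relevant context with TF-IDF, then build a prompt.
# The documents and the downstream LLM call are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Our premium plan includes 24/7 support and a 99.9% uptime SLA.",
    "Refunds are processed within 5 business days of a cancellation request.",
    "The API rate limit is 1000 requests per minute per account.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    vectorizer = TfidfVectorizer()
    doc_vecs = vectorizer.fit_transform(documents)
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_vecs)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

def build_prompt(query: str) -> str:
    """Assemble a grounded prompt from the retrieved context."""
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How fast are refunds processed?"))
# The resulting prompt would then be sent to an LLM of your choice (not shown).
```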
Nice to have:
Contributions to open-source MCP servers, AI agent frameworks, or LLM plugins
Knowledge of prompt engineering, fine-tuning, or reinforcement learning with human feedback (RLHF)
Experience with advanced data science tools
Experience applying computational algorithms and statistical methods to structured and unstructured data