Embark on a transformative journey as a Data Scientist - AVP. At Barclays, our vision is clear – to redefine the future of banking and help craft innovative solutions. As a Data Scientist within USCB Operations, you will enhance our modeling and analytics capabilities by designing, developing, and deploying Machine Learning, Generative, and Agentic AI models. Your work will focus on improving operational efficiency, elevating customer and colleague experiences, and supporting strategic decision-making. This role involves building large-scale data pipelines, utilizing cloud infrastructure (primarily AWS), and applying generative AI techniques to drive innovation. You’ll collaborate across teams to turn complex data into actionable insights and scalable solutions.
Job Responsibilities:
Identifying, collecting, and extracting data from internal and external sources
Performing data cleaning, wrangling, and transformation to ensure data quality and suitability for analysis
Developing and maintaining efficient data pipelines for automated data acquisition and processing
Designing and applying statistical and machine learning models to analyse patterns, trends, and relationships in the data
Developing and implementing predictive models to forecast future outcomes and identify potential risks and opportunities (see the illustrative sketch after this list)
Collaborating with business stakeholders to identify opportunities to add value from data through data science
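As a rough illustration of the clean-transform-model loop described in the responsibilities above, the sketch below uses pandas and scikit-learn; the file name, column names, and target variable are hypothetical placeholders and do not reflect any actual Barclays dataset or pipeline.

```python
# Minimal sketch of the clean -> transform -> model loop described above.
# The file, column names (call_volume, handle_time) and target (churned)
# are hypothetical examples only.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

df = pd.read_csv("operations_extract.csv")   # data collected from a source system

# Basic cleaning and wrangling: drop duplicates, impute missing numeric values
df = df.drop_duplicates()
numeric_cols = ["call_volume", "handle_time"]
df[numeric_cols] = df[numeric_cols].fillna(df[numeric_cols].median())

X = df[numeric_cols]
y = df["churned"]                             # hypothetical binary outcome

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

# Simple predictive model to forecast an outcome and flag potential risk
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", GradientBoostingClassifier(random_state=42)),
])
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

Keeping the preprocessing and the classifier inside one Pipeline means the same transformations are applied at training and inference time, which matters once a model like this is deployed.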
Requirements:
Relevant knowledge in data science, computer science, statistics, or a related technical field
Proven track record of designing, building, and deploying production-grade machine learning models at scale using Python (scikit-learn, TensorFlow, PyTorch) in cloud environments, with demonstrable business impact
Strong grounding in advanced statistical methods, feature engineering, and model optimization, including experience handling complex, high-dimensional datasets and real-time streaming architectures (Kafka, Kinesis, Spark Streaming)
Practical experience architecting end-to-end ML pipelines on cloud platforms such as AWS or Azure, including S3 for data lakes, SageMaker for model training and deployment, Lambda for serverless inference, and Bedrock for foundation models, with an understanding of MLOps best practices (see the pipeline sketch after this list)
Experience containerizing ML applications with Docker and orchestrating them with Kubernetes for scalable deployments
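The requirements above describe an S3-to-SageMaker-to-endpoint flow on AWS. The following is a hedged sketch of the training and deployment part using the SageMaker Python SDK; the bucket, IAM role, train.py entry script, instance types, and framework version are illustrative assumptions that would need to match a real environment.

```python
# Hedged sketch of an S3 -> SageMaker training -> managed endpoint flow.
# Bucket name, IAM role ARN, and train.py are placeholders, not real resources.
import boto3
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder role

# Stage training data in an S3 data lake (placeholder bucket and key)
boto3.client("s3").upload_file("train.csv", "example-ml-bucket", "ops/train/train.csv")

# Train a scikit-learn model defined in a user-supplied train.py script
estimator = SKLearn(
    entry_point="train.py",               # placeholder training script
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    framework_version="1.2-1",
    sagemaker_session=session,
)
estimator.fit({"train": "s3://example-ml-bucket/ops/train/"})

# Deploy the trained model to a real-time inference endpoint
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
print(predictor.endpoint_name)
```

In practice the same training image can also be packaged with Docker and scheduled on Kubernetes, which is where the containerization experience listed above comes in.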
Nice to have:
Exposure to generative and agentic AI frameworks, including techniques for fine-tuning and orchestration (see the brief fine-tuning sketch after this list)
Familiarity with MLOps practices and tools for model lifecycle management
Prior experience in financial services, risk modeling, or operational analytics
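To make the fine-tuning point above concrete, here is a minimal sketch using the Hugging Face transformers Trainer on a toy classification task; the base model and the tiny in-memory dataset are illustrative assumptions only and say nothing about the team's actual generative or agentic stack.

```python
# Minimal fine-tuning sketch with Hugging Face transformers; the base model
# and the tiny in-memory dataset are illustrative assumptions only.
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

model_name = "distilbert-base-uncased"                     # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Toy labelled examples standing in for real operational text
data = Dataset.from_dict({
    "text": ["payment posted late", "card replaced same day"] * 8,
    "label": [1, 0] * 8,
})
data = data.map(lambda x: tokenizer(x["text"], truncation=True,
                                    padding="max_length", max_length=32),
                batched=True)

args = TrainingArguments(output_dir="ft-out", num_train_epochs=1,
                         per_device_train_batch_size=8, logging_steps=1)
Trainer(model=model, args=args, train_dataset=data).train()
```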