The Lead – Data Quality and Reliability Engineering will architect and build the infrastructure, processes, technology and collaboration model that embeds rigorous data quality and reliability practices to ensure that data across all systems – from ingestion through consumption – is accurate, consistent, and trustworthy. They will spearhead the development of our next-generation data ecosystem with a quality-first mindset, helping shape a platform that is not only scalable and innovative, but also delivers trusted data for critical financial and analytical use cases. The role demands a balance of hands-on execution, strategic direction, and team leadership to ensure a reliable, high-quality data platform that instils confidence in analytics, reporting, and decision-making across the firm.
Job Responsibilities:
Architect and build the infrastructure, processes, technology and collaboration model that embeds rigorous data quality and reliability practices to ensure that data across all systems – from ingestion through consumption – is accurate, consistent, and trustworthy
Spearhead the development of our next-generation data ecosystem with a quality-first mindset, helping shape a platform that is not only scalable and innovative, but also delivers trusted data for critical financial and analytical use cases
Provide a balance of hands-on execution, strategic direction, and team leadership to ensure a reliable, high-quality data platform that instils confidence in analytics, reporting, and decision-making across the firm
Establish data quality frameworks that combine multiple approaches: rules-based validation (schema checks, business rules, thresholds), statistical profiling (distribution checks, drift detection), machine learning models for anomaly detection, and LLM-assisted contextual checks for semantic data validation (an illustrative sketch of such layered checks appears at the end of this description)
Integrate these validation models into data pipelines for real-time (synchronous) checks, and implement asynchronous large-scale monitoring for batch data flows
Manage the lifecycle of data quality models – including regular retraining, version control, and performance monitoring – to ensure quality checks remain effective as data evolves
Establish and publish data quality metrics and dashboards (e.g. completeness, timeliness, accuracy, consistency) to provide a transparent, measurable view of data health across the platform (a metrics sketch likewise appears at the end of this description)
Embed robust observability hooks and automated remediation processes into data pipelines, so that data issues are detected early and addressed proactively without manual intervention
Develop governance around the lifecycle of data quality models, from training to deployment
Define and track KPIs for data quality and reliability
Partner with Data Platform Engineering, data governance, and analytics leads to embed quality monitoring throughout data flows
Requirements:
10+ years of experience in large-scale data quality, data science, or ML engineering
Experience operationalizing ML or rules-based quality systems at enterprise scale
Deep understanding of data validation, anomaly detection, and data observability
Domain knowledge of investment systems and capital markets is a strong plus
Proficiency in data architecture and prior experience with financial or investment data systems are beneficial
A Bachelor’s degree in Computer Science, Engineering, or a related discipline, or equivalent experience, is required
Demonstrated ability to lead and mentor engineering teams, establish high coding standards, and drive continuous improvement in platform reliability and developer efficiency
Excellent collaboration and communication skills, with comfort in bridging technical and business needs to deliver results
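For illustration, a minimal Python sketch of the kind of layered quality checks described in the responsibilities above, combining rules-based validation with a simple statistical drift check; it assumes pandas, and the column names, the 10,000 threshold, and the baseline statistics are hypothetical. A production framework would add ML-based anomaly detection, LLM-assisted semantic checks, scheduling, and alerting.

import pandas as pd

# Hypothetical batch of trade records; column names and thresholds are assumptions for illustration.
batch = pd.DataFrame({
    "trade_id": [1, 2, 3, 4],
    "trade_amount": [1050.0, 980.5, None, 25000.0],
    "currency": ["USD", "USD", "EUR", "USD"],
})

def rules_based_checks(df: pd.DataFrame) -> list[str]:
    """Rules-based validation: schema, completeness, and threshold checks."""
    required = {"trade_id", "trade_amount", "currency"}
    missing = required - set(df.columns)
    if missing:
        return [f"schema: missing columns {sorted(missing)}"]
    issues = []
    if df["trade_amount"].isna().any():
        issues.append("completeness: null trade_amount values found")
    if (df["trade_amount"] > 10_000).any():
        issues.append("threshold: trade_amount above 10,000 flagged for review")
    return issues

def drift_check(df: pd.DataFrame, baseline_mean: float, baseline_std: float) -> list[str]:
    """Simple statistical profiling: flag drift when the batch mean moves
    more than three standard deviations from a historical baseline."""
    mean = df["trade_amount"].dropna().mean()
    if abs(mean - baseline_mean) > 3 * baseline_std:
        return [f"drift: batch mean {mean:.2f} deviates from baseline {baseline_mean:.2f}"]
    return []

issues = rules_based_checks(batch) + drift_check(batch, baseline_mean=1000.0, baseline_std=150.0)
for issue in issues:
    print(issue)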
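A similarly minimal sketch of dashboardable quality metrics (completeness and timeliness) of the kind referenced above; the feed, column names, and the 2-hour freshness target are hypothetical.

import pandas as pd

# Hypothetical positions feed; values exist only to exercise the metrics below.
positions = pd.DataFrame({
    "account_id": ["A1", "A2", "A3", None],
    "market_value": [1_200_000.0, None, 530_000.0, 75_000.0],
    "as_of": pd.to_datetime(["2024-06-03 06:00", "2024-06-03 06:00",
                             "2024-06-03 06:00", "2024-06-02 18:00"]),
})

def quality_metrics(df: pd.DataFrame, now: pd.Timestamp) -> dict:
    """Completeness and timeliness metrics suitable for publishing to a dashboard."""
    total_cells = df.shape[0] * df.shape[1]
    completeness = 1.0 - df.isna().sum().sum() / total_cells
    fresh = (now - df["as_of"]) <= pd.Timedelta(hours=2)  # assumed 2-hour freshness target
    timeliness = fresh.mean()
    return {"completeness": round(float(completeness), 3),
            "timeliness": round(float(timeliness), 3)}

print(quality_metrics(positions, now=pd.Timestamp("2024-06-03 07:00")))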