
PLEAD: Provenance-driven and Legally-grounded Explanations for Automated Decisions

Published: 13 January 2021

The increasing use of automated decision making by organisations has created an expectation that they provide their customers and users with adequate explanations of those decisions. Customers want to understand whether the right data has been used and whether the decision was made fairly. The use of artificial intelligence (AI) in automated decision making means that explanations now need to be more sophisticated: they need to take into account the algorithm that was used, the data involved, where the data came from, how the data was selected, how the algorithm was designed, and the methods in place to avoid bias.

PLEAD is developing technology that can generate explanations which satisfy legal requirements and take into account the new role of AI. The PLEAD team is designing a tool, the Explanation Assistant, that will use provenance data to help data controllers explain automated decisions to their customers and users with confidence.

PLEAD has now completed its first year of research and development. During this year the PLEAD team has conducted innovative work on the legal requirements for explanations in a number of sectors, including finance, education, local government and law enforcement, and on the role of provenance in creating building blocks for the automated generation of explanations.

Preliminary results of our work are captured in the report Provenance-based Explanations for Automated Decisions: Final IAA Project Report. PLEAD’s innovative approach has been recognised by the ICO and cited in its recent guidance on explainability, “Project ExplAIn: Explaining decisions made with AI”.1 PLEAD’s panel discussion at this year’s Web Science conference,2 which hosted panellists at the forefront of explainable AI, gave participants the opportunity to debate the challenges of computable explanations and to learn about advances in automated explanation generation.

Entering its second year, PLEAD is piloting the automated generation of explanations in two automated decision-making scenarios: credit application decisions and school allocation decisions. The pilots will be showcased at industry and public engagement events to raise awareness of the opportunities presented by provenance-based, legally-grounded explanations.
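
To give a flavour of what such a provenance record might look like, the sketch below expresses a hypothetical credit application decision in the W3C PROV data model, using the Python `prov` package. It is an illustration only: the identifiers (ex:loan-application-42, ex:scoring-model-v3 and so on) are invented, and this is not PLEAD’s Explanation Assistant itself.

```python
# A minimal PROV sketch of a hypothetical credit decision, written
# with the Python `prov` package (pip install prov). All identifiers
# are invented for illustration; this is not the PLEAD implementation.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace('ex', 'http://example.org/credit/')

# The input data, the automated activity, and the agents involved.
application = doc.entity('ex:loan-application-42')
outcome = doc.entity('ex:decision-outcome-42')
decision = doc.activity('ex:credit-decision-run-42')
model = doc.agent('ex:scoring-model-v3')
bank = doc.agent('ex:example-bank')

# Relationships from which explanations could later be generated.
doc.used(decision, application)           # the run used the applicant's data
doc.wasAssociatedWith(decision, model)    # the scoring model performed the run
doc.actedOnBehalfOf(model, bank)          # acting on the data controller's behalf
doc.wasGeneratedBy(outcome, decision)     # the run produced the outcome
doc.wasDerivedFrom(outcome, application)  # the outcome traces back to the data

print(doc.serialize(indent=2))  # PROV-JSON that a downstream tool could query
```

Queried over a graph like this, a tool in the spirit of the Explanation Assistant could answer questions such as “which data was used?” and “who is responsible for this decision?” by traversing the used, wasAssociatedWith and actedOnBehalfOf relations.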



1 At p. 59.
2 Niko Tsakalakis, Laura Carmichael, Sophie Stalla-Bourdillon, Luc Moreau, Dong Huynh, and Ayah Helal. 2020. Explanations for AI: Computable or Not? In 12th ACM Conference on Web Science Companion (WebSci '20). Association for Computing Machinery, New York, NY, USA, 77. DOI:https://doi.org/10.1145/3394332.3402900
