Home Office report reveals significant bias in PND retrospective facial recognition tool
05/12/2025 | UK Government
New research, published just hours after Sarah Jones, Minister for Policing and Crime Prevention, announced plans last week to expand police use of facial recognition, has revealed concerning racial bias in the Home Office's retrospective facial recognition (RFR) technology used within the Police National Database (PND).
The research, conducted by the National Physical Laboratory (NPL) and commissioned by the Home Office in collaboration with the Office of the Policing Chief Scientific Adviser, found that the technology was "more likely to incorrectly include some demographic groups in its search results" than others.
Specifically, the NPL report identified that the RFR technology is more likely to incorrectly match Black and Asian individuals than their White counterparts. When analysts examined the tool at a lower threshold setting, the false-positive identification rate (FPIR) for White subjects was 0.04%, significantly lower than that for Asian (4.0%) and Black (5.5%) subjects. The report indicated that the bias was particularly pronounced for Black women, whose FPIR was 9.9%, compared with 0.4% for Black male subjects.
In response to the findings, the Home Office advised that a new algorithm has been developed and independently tested, which can be used at threshold settings that show no significant demographic variation in performance.
In a statement responding to the news, Emily Keaney, Deputy Commissioner at the Information Commissioner's Office (ICO), confirmed that the regulator acknowledges "measures are being taken to address this bias. However, it's disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services."
The statement goes on to say that the ICO has asked the Home Office for urgent clarity on the matter so that it can assess the situation and consider its next steps.
Meanwhile, a separate statement by the Association of Police and Crime Commissioners (APCC) highlights that the NPL report sheds light on a "concerning in-built bias". The APCC acknowledged that while mitigations have been introduced and the report focuses solely on the PND algorithm, it suggests that the technology was deployed into operational policing "without adequate safeguards in place".
Furthermore, the APCC warned that the lack of any adverse impact in individual cases appears to be "more by luck than design," particularly given that system failures had been known for some time but were not shared with affected communities or sector stakeholders. While the APCC recognises the importance of embracing emerging technology to combat crime, it stressed that public trust requires full transparency regarding these increasingly invasive tools.
As the story continues to unfold, The Guardian revealed that police forces successfully lobbied for the use of the biased RFR system, despite concerns about its discriminatory impact on women, young people, and ethnic minorities.
Documents indicate that the police were aware of these issues for over a year and sought to reverse an earlier decision intended to mitigate the bias. The National Police Chiefs' Council (NPCC) initially raised the confidence threshold for matches to reduce bias, a change that cut the match rate from 56% to just 14%; following complaints of decreased investigative leads, it quickly reverted to a lower threshold. Although the Home Office has not disclosed the current threshold, the review found that Black women faced false positives nearly 100 times more often than their White counterparts, highlighting ongoing concerns over the system's fairness and effectiveness.
What is this page?
You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. The information here is a brief snippet relating to a single piece of original content, or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.
The Privacy Newsfeed monitors over 300 global publications, of which more than 6,250 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.