Home Office report reveals significant bias in PND retrospective facial recognition tool

05/12/2025 | UK Government

Home Office kept police facial recognition flaws to itself, UK data watchdog fumes

New research, published just hours after Sarah Jones, Minister for Policing and Crime Prevention, announced plans last week to expand police use of facial recognition, has revealed concerning racial bias in the Home Office's retrospective facial recognition (RFR) technology used within the Police National Database (PND).

The research, conducted by the National Physical Laboratory (NPL) and commissioned by the Home Office in collaboration with the Office of the Policing Chief Scientific Adviser, found that the technology was "more likely to incorrectly include some demographic groups in its search results" than others.

Specifically, the NPL report identified that the RFR technology is more likely to incorrectly match Black and Asian individuals than their White counterparts. When analysts examined the tool at a lower threshold setting, the false-positive identification rate (FPIR) for White subjects was 0.04%, significantly lower than the rates for Asian (4.0%) and Black (5.5%) subjects. The report indicated that the bias was particularly pronounced for Black women, whose FPIR was 9.9%, compared with 0.4% for Black male subjects.
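To illustrate what a false-positive identification rate means in this context, below is a minimal, hypothetical Python sketch showing how an FPIR could be computed per demographic group at a fixed match threshold. The similarity scores, group labels and threshold value are invented for illustration only; they are not drawn from the NPL report or the PND system.

```python
import numpy as np

# Hypothetical illustration: computing a false-positive identification rate
# (FPIR) per demographic group at a fixed match threshold. All figures below
# are simulated and do not reflect the NPL report's data.

rng = np.random.default_rng(0)

# Simulated similarity scores for searches where the probe subject is NOT in
# the database, so any score at or above the threshold is a false positive.
groups = {
    "group_a": rng.normal(0.30, 0.10, 10_000),
    "group_b": rng.normal(0.36, 0.10, 10_000),
    "group_c": rng.normal(0.42, 0.10, 10_000),
}

threshold = 0.60  # hypothetical match threshold; lowering it raises every group's FPIR

for group, scores in groups.items():
    false_positives = (scores >= threshold).sum()
    fpir = false_positives / len(scores)
    print(f"{group}: FPIR = {fpir:.2%}")
```

The point of the comparison in the report is that, at the same threshold, some groups return false matches far more often than others; raising the threshold reduces false positives overall but does not by itself remove the disparity between groups.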

In response to the findings, the Home Office has confirmed that a new algorithm has been developed and independently tested, which can be used at threshold settings that show no significant demographic variation in performance.

In a statement responding to the news, Emily Keaney, Deputy Commissioner at the Information Commissioner's Office (ICO), said: "We acknowledge that measures are being taken to address this bias. However, it's disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services."

The statement goes on to confirm that the ICO has asked the Home Office for urgent clarity on the matter so it can assess the situation and consider its next steps.

Meanwhile, a separate statement by the Association of Police and Crime Commissioners (APCC) highlights that the NPL report sheds light on a "concerning in-built bias". While acknowledging that mitigations have been introduced and that the report focuses solely on the PND algorithm, the APCC said the findings suggest the technology was deployed into operational policing "without adequate safeguards in place".

Furthermore, the APCC warned that the lack of any adverse impact in individual cases appears to be "more by luck than design", particularly given that system failures had been known for some time but were not shared with affected communities or sector stakeholders. While the APCC recognises the importance of embracing emerging technology to combat crime, it stressed that public trust requires full transparency regarding these increasingly invasive tools.


Training Announcement: Find out more about our range of independent accredited data protection and AI governance qualifications from IAPP and BCS that support practitioners in adopting emerging technologies such as facial recognition.  

