ICO reminds police to follow data protection rules when using facial recognition

Published: 17/03/2026 | ICO

A blog article by Emily Keaney, Deputy Commissioner for Regulatory Policy at the Information Commissioner's Office (ICO), explains why data protection is the foundation of responsible police use of facial recognition technology (FRT).

Keaney writes that for the public in England and Wales to maintain trust in these systems, police forces must ensure they do not jeopardise civil liberties. New ICO research highlights that the public's primary concerns regarding the regulation of FRT include accuracy, proper officer training, and safeguards against bias.

As part of its research, the ICO audited a number of police forces using FRT. Among those audited, Essex Police has now paused its live facial recognition (LFR) deployments after identifying potential accuracy and bias risks.

The ICO plans to publish an outcomes report later this year to share lessons learned across all UK forces. The regulator maintains that successful implementation of FRT depends on strong governance, including clear policies, defined roles, and lawful data handling processes. Furthermore, forces are expected to conduct routine testing for discriminatory outcomes that may arise from technology design, training data, or watchlist composition. The ICO's core message is that data protection laws provide essential safeguards for proportionality and human rights, and that any future regulatory regimes should build on these foundations rather than replace them.

In a statement commenting on Essex Police's decision to pause its LFR operations, Jake Hurfurt, Head of Research and Investigations at Big Brother Watch (BBW), said: "BBW warned that Essex Police's failure to check the accuracy of its LFR algorithm would put the rights of thousands of people at risk.

"LFR as a tool of general mass surveillance has no place in a democracy like Britain, but if police are going to use it the very least the public can expect is that it doesn't racially discriminate against people.

"It's deeply concerning that Essex Police appear to have taken this facial recognition company's claims at face value and deployed a system they had not even tested.

"Police across the country must take note of this fiasco. Essex Police have serious questions to answer about how and why this tech was used so widely. AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.

"The government should urgently reconsider its shocking plans to squander millions of pounds of public money on a five-fold increase of LFR across the country."

Additional reporting in The Guardian



What is this page?

You are reading a summary article on the Privacy Newsfeed, a free resource helping DPOs and other professionals with privacy or data protection responsibilities stay informed of industry news in one place. Each entry is a brief summary of a single piece of original content, or of several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.

The Privacy Newsfeed monitors over 300 global publications, of which more than 3,250 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.