EHRC intervenes in legal challenge against facial recognition use by Met Police
20/08/2025 | Equality and Human Rights Commission
The Equality and Human Rights Commission (EHRC) has been granted permission to intervene in a judicial review challenging the Metropolitan Police Service (MPS) over its use of live facial recognition (LFR) technology. The legal action, brought by Shaun Thompson, an anti-knife crime community worker who was wrongly identified by the technology, and Silkie Carlo, Director of Big Brother Watch, raises issues of significant public importance.
The EHRC argues that the current LFR policy adopted by the MPS is incompatible with Articles 8 (right to privacy), 10 (freedom of expression), and 11 (freedom of assembly and association) of the European Convention on Human Rights.
According to the EHRC, LFR use has become far more frequent since a 2020 Court of Appeal ruling found South Wales Police's deployment of the technology to be unlawful. The EHRC's submission argues that the increased scale of LFR use poses a considerable threat to human rights and could have a "chilling effect" on the rights to freedom of expression and assembly, particularly when used at protests.
The regulator also notes that LFR accuracy is paramount, as even low error rates can lead to a significant number of false identifications, with serious consequences for the individuals affected. The submission highlights data showing a disproportionate number of false alerts for Black men. While the EHRC welcomes the MPS's recent adoption of a minimum accuracy threshold, it notes the continued risk of misidentification.
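To make the regulator's point concrete, a back-of-the-envelope sketch (using hypothetical figures, not MPS statistics) shows how even a small per-scan false-alert rate translates into many misidentifications once scanning volumes are large:

```python
# Illustrative only: hypothetical figures, not actual MPS deployment data.
def expected_false_alerts(faces_scanned: int, false_alert_rate: float) -> float:
    """Expected number of false alerts at a given scanning volume."""
    return faces_scanned * false_alert_rate

# A 0.1% false-alert rate sounds small, but over 100,000 scanned faces
# it still implies roughly 100 people wrongly flagged.
print(expected_false_alerts(100_000, 0.001))  # -> 100.0
```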
EHRC Chief Executive, John Kirkpatrick, stated that while LFR can be a useful tool, its use must be governed by clear rules ensuring it is necessary and proportionate and subject to appropriate safeguards. The EHRC believes the policy adopted by the MPS falls short of these standards and must be made consistent with the law and human rights.
In a statement responding to the news, Interim Director of Big Brother Watch, Rebecca Vincent, said the EHRC's "intervention in this landmark legal challenge is hugely welcome, necessary, and incredibly timely. The rapid proliferation of invasive live facial recognition technology without any legislation governing its use is one of the most pressing human rights concerns in the UK today. Live facial recognition surveillance turns our faces into barcodes and makes us a nation of suspects who, as we've seen in Shaun's case, can be falsely accused, grossly mistreated and forced to prove our innocence to authorities.
"We are supporting this case precisely because of the fundamental rights issues at stake, and it's important that the human rights regulator weighs in on this too. No other democracy in the world spies on its population with live facial recognition in the cavalier and chilling way the UK is starting to, and it is alarming that the Government is seeking to expand its use across the country. Given this crucial ongoing legal action, the Home Office and police's investment in this dangerous and discriminatory technology is wholly inappropriate and must stop."
The news comes as eleven civil liberties and anti-racism groups have called on MPS Commissioner Mark Rowley to scrap plans to deploy LFR cameras at the Notting Hill Carnival. Rowley has since responded, clarifying that the LFR cameras would be used "in a non-discriminatory way" with the algorithm set so that it "does not perform in a way which exhibits bias."
News of the EHRC intervention stands in sharp contrast to a glowing audit report published this week by the Information Commissioner's Office (ICO). The report, which assessed the facial recognition technology (FRT) practices of South Wales Police and Gwent Police, found the forces' processes and procedures provided a "high level of assurance" and complied with data protection laws.
In related news, academics from the University of Oxford have criticised the deployment of LFR technology, arguing that its real-world performance is significantly worse than laboratory-based benchmark tests suggest. In a post for Tech Policy Press, the researchers claim that an evaluation by the US National Institute of Standards and Technology (NIST), which reported accuracy figures as high as 99.95%, fails to reflect real-world conditions where images may be blurred or obscured. They also claim that the datasets used are too small and do not accurately represent real-world demographics. The academics cite several public failures of LFR technology as evidence, including the misidentification of Thompson alongside the wrongful arrest of a Detroit man.
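The gap between benchmark accuracy and street deployment can be illustrated with a simple base-rate calculation (all numbers here are hypothetical, chosen for illustration rather than drawn from NIST or police data): when genuine watchlist matches are rare among the faces scanned, even a highly accurate matcher produces alerts that are mostly false.

```python
# Hypothetical base-rate illustration; not NIST or MPS figures.
faces_scanned = 100_000        # faces passing the camera in a deployment
watchlist_prevalence = 1e-4    # 1 in 10,000 scanned faces is genuinely on a watchlist
true_positive_rate = 0.9995    # benchmark-level sensitivity
false_positive_rate = 0.0005   # 0.05% of non-matches wrongly flagged

true_matches = faces_scanned * watchlist_prevalence
true_alerts = true_matches * true_positive_rate                       # ~10
false_alerts = (faces_scanned - true_matches) * false_positive_rate   # ~50

precision = true_alerts / (true_alerts + false_alerts)
print(f"true alerts:  {true_alerts:.1f}")
print(f"false alerts: {false_alerts:.1f}")
print(f"precision:    {precision:.0%}")  # most alerts are false positives
```

On these assumed figures, a system with 99.95% benchmark accuracy would still generate around five false alerts for every genuine match, and real-world conditions such as blurred or obscured images would push the error rates higher still.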
Meanwhile, a separate study published in May by the University of Pennsylvania supports these claims, finding that the accuracy of LFR technology degrades under poor image conditions, with false positive and false negative rates disproportionately affecting individuals from marginalised racial and gender groups.