Essex Police has halted the use of live facial recognition (LFR) cameras after a study found they were more likely to identify black people than people from other ethnic groups.
The cameras are mounted on vans and are designed to identify people on a watch list who pass by.
Thirteen forces were using the technology by the end of last year, and the Home Secretary said in January that the number of LFR vans would increase from 10 to 50.
However, Essex Police said it had stopped using LFR after “potential bias in the positive identification rate” – although it now believes the problem has been fixed by updating the algorithm.
Around 200 people were recruited by researchers at the University of Cambridge to test the LFR system during one of the force's deployments.
They found that only about half of the people on the watch list were correctly identified, and that it was “extremely rare” for a person to be flagged if they were not on the list.
But the study found that black people were “statistically significantly more likely” to be correctly identified than people of other ethnicities. Men were also “more likely” to be identified than women.
The researchers said it raised “questions about impartiality that require continued monitoring”.
Essex Police told Sky News that the study was one of two it had commissioned – and the other suggested no bias – but that it had halted deployment to work “with the algorithm software provider” to update the system.
It says a further evaluation has been conducted and it believes the system is ready to return to the streets.
“We have revised our policies and procedures and are now confident that we can begin to deploy this vital technology as part of policing operations to locate and arrest wanted criminals,” it said in a statement.
“We will continue to monitor all results to ensure there is no risk of bias against any one section of the community.”
‘Real risk of unfairness’
Beyond concerns about identifying some groups more than others, the study also examined how LFR has worked so far for Essex Police.
It said around 1.3 million faces were scanned from August 2024 to February 2025, leading to 48 arrests, or one out of every 27,000 faces.
There was only one false alert.
The researchers said that different LFR systems and conditions may produce different results and that more testing is needed “to fully understand the performance of the technology.”
As the government looks to increase the use of the technology, questions over privacy and the large number of images captured remain a major concern.
The study said “proportionality, transparency and oversight” were important in deciding when to use LFRs, and the Information Commissioner’s Office (ICO) is also investigating the technology.
“All forces should also routinely test for bias and discriminatory outcomes – whether arising from technology design, training data, or watchlist structure,” the ICO said.
“Without it, there is a real risk of unfairness.”
The Home Office said that if an individual’s image does not match the watch list it is deleted “instantly and automatically”, and that all deployments are “targeted, intelligence-based, time-bound and geographically limited”.
It said more than 1,300 people suspected of serious crimes including rape, domestic abuse and GBH were arrested in London thanks to the LFR between January 2024 and September 2025.
