Essex Police Halts Facial Recognition Over Racial Bias Study Findings

Essex Police has paused the use of live facial recognition (LFR) technology after a study uncovered significant racial bias in the system, finding that black people were disproportionately likely to be flagged compared with other ethnic groups. The decision was disclosed by the Information Commissioner’s Office (ICO), which oversees the deployment of such technology by at least 13 police forces across England and Wales.

Study Reveals Disproportionate Targeting of Black Individuals

University of Cambridge academics, commissioned by Essex Police, conducted a study in which 188 actors walked past LFR cameras deployed from marked police vans in Chelmsford. The results, published recently, showed that while about half of the people on a watchlist were correctly identified and incorrect identifications were rare, the system was statistically significantly more likely to correctly identify black participants than those from other ethnic groups. It was also more effective at identifying men than women.

Dr. Matt Bland, a criminologist and co-author of the report, emphasized the implications, stating, "If you’re an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you’re black. To me, that warrants further investigation." The report concluded that this bias raises fairness concerns requiring ongoing monitoring.


Regulatory Warnings and Broader Deployment Context

The ICO has warned other police forces to implement mitigations against accuracy and bias risks. LFR systems, which can be mounted in fixed locations or deployed in vans, have seen increased use, with Home Secretary Shabana Mahmood announcing a five-fold expansion that will make 50 vans available to police forces across England and Wales. Despite this, the technology has faced criticism for potential biases and inaccuracies.

In a separate incident last month, retrospective face scanning software led to the wrongful arrest of a man for a burglary in a city he had never visited, highlighting broader concerns about misidentification, particularly among people of south Asian heritage. This differs from the bias issue in LFR, which focuses on disproportionate targeting rather than false positives.

Potential Causes and Expert Opinions

Experts suggest the bias in LFR may stem from overtraining algorithms on the faces of black people, a problem that could potentially be rectified by adjusting system settings. A government study by the National Physical Laboratory found that black men were most likely to be correctly matched, while white men were least likely, though this effect was not statistically significant.

Opponents of facial recognition technology argue that the latest research validates long-standing warnings about bias. Jake Hurfurt, head of research and investigations at Big Brother Watch, commented, "Police across the country must take note of this fiasco. AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets."

Police Response and Future Steps

Essex Police has stated that, after identifying potential bias, it paused deployments to work with software providers on updates and to seek further academic assessment. The force has since revised its policies and procedures, expressing confidence in resuming deployments to trace and arrest wanted criminals while committing to ongoing monitoring to prevent bias against any community.

The Home Office reports that LFR cameras in London from January 2024 to September 2025 led to over 1,300 arrests for serious crimes, underscoring the technology's role in law enforcement. However, the pause in Essex highlights the need for balanced approaches that address ethical concerns while leveraging technological advancements.
