London Bridge Facial Recognition Trial Raises Alarm After Children Incorrectly Flagged
British Transport Police (BTP) has only just begun its live facial recognition (LFR) trial at London Bridge station, yet concerns are already mounting after reports that earlier AI monitoring trials on the London Underground incorrectly flagged children as potential offenders.
How the Recognition System Operates
Cameras were switched on at the busy transport hub on 11 February as part of a six-month pilot designed to identify people wanted for serious criminal offences as they pass through major railway stations. The system continuously scans faces within a designated "recognition zone" and compares the resulting biometric readings against a watchlist of persons of interest.
When the algorithm suggests a potential match, a human officer reviews the alert before deciding whether to intervene. Police emphasise that anyone not on the watchlist will not be identified through this process, and that their biometric data is deleted immediately after capture.
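As a rough illustration of the compare, review, delete workflow police describe, the sketch below shows the logic in Python. Everything in it, from the function names to the similarity threshold, is an assumption made for illustration; it is not the BTP or NEC implementation.

```python
# Hypothetical sketch of the compare-review-delete workflow described above.
# All names, the threshold, and the cosine-similarity matching are
# illustrative assumptions, not details of the actual BTP/NEC system.
from dataclasses import dataclass


@dataclass
class Face:
    embedding: list[float]  # biometric template extracted from a camera frame


def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two biometric templates."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)


def process_face(face: Face, watchlist: dict[str, list[float]],
                 threshold: float = 0.8) -> str | None:
    """Compare one captured face against the watchlist.

    Returns a watchlist ID for a human officer to review, or None,
    in which case the biometric template is discarded at once.
    """
    best_id, best_score = None, 0.0
    for person_id, template in watchlist.items():
        score = similarity(face.embedding, template)
        if score > best_score:
            best_id, best_score = person_id, score

    if best_score >= threshold:
        return best_id  # alert: an officer decides whether to intervene
    face.embedding = []  # no match: biometric data deleted immediately
    return None
```

Note that in the workflow as described, the algorithm only nominates a candidate; the decision to stop anyone rests with the reviewing officer.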
Accuracy Concerns from Previous Trials
The deployment follows earlier AI monitoring trials on the London Underground that raised substantial questions about accuracy. During trials aimed at combating fare evasion at stations including Willesden Green, children walking close behind their parents were mistakenly flagged as potential fare evaders by the recognition technology.
The current system uses NEC's NeoFace algorithm, which has been independently tested by the National Physical Laboratory. Police say they are operating the technology at recommended settings designed to "minimise the likelihood of any false positive indication and adverse impact on equitability."
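Those settings amount, in essence, to where the match threshold sits: raise it and false alerts fall, but genuine matches can be missed. The toy example below illustrates that trade-off with invented scores; it has no connection to the National Physical Laboratory's test data or to NeoFace's actual accuracy.

```python
# Toy illustration of the threshold trade-off. All scores are invented;
# they are not NPL test data and say nothing about NeoFace's accuracy.
genuine_scores = [0.91, 0.88, 0.84, 0.79, 0.75]    # same-person comparisons
impostor_scores = [0.72, 0.65, 0.81, 0.58, 0.69]   # different-person comparisons

for threshold in (0.6, 0.7, 0.8, 0.9):
    false_alerts = sum(s >= threshold for s in impostor_scores)
    missed_matches = sum(s < threshold for s in genuine_scores)
    print(f"threshold={threshold:.1f}  false alerts={false_alerts}  "
          f"missed matches={missed_matches}")
```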
Police Perspective and Passenger Options
Chief Superintendent Chris Casey, who is overseeing the pilot, said: "This is a trial of the technology to assess how it performs in a railway setting. The initiative follows a significant amount of research and planning and forms part of BTP's commitment to using innovative technology to make the railways a hostile place for individuals wanted for serious criminal offences."
Passengers who prefer not to pass through the recognition zone will be offered alternative routes through the station, and signage at participating locations informs travellers that the trial is under way.
Growing Scrutiny of Facial Recognition Technology
The Metropolitan Police deployed similar technology 231 times last year alone, scanning approximately four million faces and making 801 arrests "specifically as a result of LFR," according to submissions made during a High Court judicial review of the practice. That works out at roughly one arrest for every 5,000 faces scanned.
Representing the Metropolitan Police, Anya Proops KC told the court that locating wanted individuals across London was "akin to looking for stray needles in an enormous, exceptionally dense haystack." She further argued that the privacy intrusion was "only minimal" because data from non-matches is deleted "a fraction of a second" after capture.
Legal Challenges and Regulatory Gaps
Campaign group Big Brother Watch and London resident Shaun Thompson are challenging the Metropolitan Police's use of facial recognition technology in the courts. Thompson was stopped at London Bridge in 2024 after being wrongly identified by a facial recognition-equipped police van, and was held for around 30 minutes before being released.
Matthew Feeney of Big Brother Watch commented: "We all want train passengers to travel safely, but subjecting law-abiding passengers to mass biometric surveillance is a disproportionate and disturbing response. Facial recognition technology remains unregulated in the UK and police forces are writing their own rules."
This regulatory vacuum persists despite a 2020 Court of Appeal ruling that found South Wales Police had used live facial recognition unlawfully due to insufficient safeguards. While police forces now operate under a framework combining common law powers, data protection regulations, and human rights legislation, no single statute specifically governs facial recognition deployment.
A Home Office consultation on how facial recognition technology should be regulated has yet to conclude, even as further trials continue across the country. Ministers have repeatedly signalled that artificial intelligence will play a central role in plans to modernise policing across the United Kingdom.