Facial Recognition Bias Leads to Wrongful Arrest of Asian Man in UK

Alvi Choudhury, a 26-year-old software engineer, was arrested at his home in Southampton in January after automated facial recognition software deployed by Thames Valley police incorrectly matched him with a burglary suspect in Milton Keynes, 100 miles away. The incident has sparked concerns about racial bias in police technology, as Choudhury, who is of south Asian heritage, was held in custody for nearly 10 hours despite clear differences in appearance from the suspect.

Details of the Arrest and Technology Flaws

According to documents shared with the Guardian by Liberty Investigates, Thames Valley police used facial recognition software that identified Choudhury as a match for a suspect involved in a £3,000 burglary. However, CCTV footage showed the suspect was noticeably younger, with different facial features, including lighter skin, a larger nose, no facial hair, and smaller lips. Choudhury, who wears a beard, expressed anger and confusion, stating, "I just assumed that the investigative officer saw that I was a brown person with curly hair and decided to arrest me."

The software, procured by the Home Office from the German company Cognitec, runs about 25,000 searches a month against a database of 19 million police mugshots. Research commissioned by the Home Office found that, at certain settings, the technology produces false positives at rates of 5.5% for black faces and 4.0% for Asian faces, compared with 0.04% for white faces. Police and crime commissioners have warned of "concerning in-built bias" in the system.
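To illustrate the scale of the disparity those figures imply, the arithmetic below is a rough sketch only: it assumes, purely for illustration, that each quoted false positive rate applies uniformly per search, which may not match how the Home Office research defines the rates.

```python
# Rough, illustrative arithmetic based on the figures quoted above.
# Assumption (not from the research itself): each rate applies per search.
monthly_searches = 25_000

false_positive_rates = {
    "black": 0.055,   # 5.5%
    "asian": 0.040,   # 4.0%
    "white": 0.0004,  # 0.04%
}

for group, rate in false_positive_rates.items():
    # Expected false matches if every monthly search involved a face
    # from this group, at the quoted rate.
    expected = monthly_searches * rate
    print(f"{group}: {expected:.0f} expected false matches per {monthly_searches:,} searches")

# Relative disparity between the highest and lowest quoted rates.
ratio = false_positive_rates["black"] / false_positive_rates["white"]
print(f"black-to-white false positive ratio: roughly {ratio:.0f}x")
```

At these rates the black-to-white disparity works out to roughly 137 to 1, which is the gap the "in-built bias" warnings refer to.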

Impact on Choudhury and Legal Action

Choudhury was arrested while working at home, handcuffed in front of his neighbors, and held until 2 a.m., causing distress to his father and preventing him from working the next day. He is now claiming damages against Thames Valley police and Hampshire constabulary, which executed the arrest. His lawyer, Iain Gould of DPP Law, emphasized that "police must ensure that artificial intelligence is not substituted for human intelligence and due diligence."

This was not Choudhury's first wrongful arrest: in 2021 he was arrested after being attacked on a night out in Portsmouth, and no further action was taken. He fears that having multiple mugshots on the police database could lead to more wrongful arrests, asking, "In my head, if a brown person in Scotland robs a bank are they going to come and arrest me?" He also expressed concern about how this affects his work with government clients requiring security clearance.

Police Response and Broader Concerns

Thames Valley police admitted that the arrest "may have been the result of bias within facial recognition technology" but denied it was unlawful, stating the decision was based on a human visual assessment. However, Choudhury reported that officers laughed when he questioned the resemblance and that Thames Valley police later acknowledged he was not the suspect after reviewing the footage.

Warnings about facial recognition technology have been raised repeatedly. In December 2024, the UK's biometrics and surveillance camera commissioner, William Webster, expressed concern over police retaining images of people never charged. Last month, South Wales police paid damages to a black man wrongfully arrested due to similar technology. The Home Office is reviewing guidance and developing a new national facial matching system with an improved algorithm to address these issues.

Thames Valley police have also deployed live facial recognition in locations including Oxford and Slough, capturing 100,000 faces and leading to six arrests. Choudhury is calling for greater transparency about wrongful arrests involving this technology to prevent future incidents.