The UK's data protection regulator is demanding urgent answers from the Home Office after a bombshell report revealed significant racial bias in police facial recognition technology.
Watchdog Raises Alarm Over 'Concerning Inbuilt Bias'
The Information Commissioner's Office (ICO) has formally requested 'urgent clarity' from the Home Office regarding the findings. Testing by the National Physical Laboratory (NPL) on the technology used within the Police National Database showed it is more likely to incorrectly match Black and Asian people than white people.
Emily Keaney, the ICO's deputy commissioner, expressed disappointment that the watchdog was not informed earlier, despite regular engagement. 'While we appreciate the valuable role technology can play, public confidence in its use is paramount, and any perception of bias and discrimination can exacerbate mistrust,' she stated. The ICO's next steps could include enforcement action, such as fines or orders to cease using the system.
Shocking Disparities in False Positive Rates
The technical analysis uncovered stark disparities in performance across different demographics. At a standard setting, the false positive identification rate (FPIR) for white subjects was just 0.04%. This jumped dramatically to 4.0% for Asian subjects and 5.5% for Black subjects.
The report highlighted that the error rate was particularly egregious for Black women, with a false positive rate of 9.9%, compared to 0.4% for Black male subjects. These findings emerged just hours after Policing Minister Sarah Jones hailed facial recognition as the 'biggest breakthrough since DNA matching'.
Home Office Response and National Expansion Concerns
In response, a Home Office spokesperson said the department took the findings 'seriously' and had already procured a new algorithm 'which has no statistically significant bias'. The department has also asked the police inspectorate and the forensic science regulator to review the technology's use.
However, Police and Crime Commissioners warned the report 'sheds light on a concerning inbuilt bias' and urged extreme caution over plans for a national expansion. Proposed wider use could see cameras deployed in shopping centres, stadiums, and transport hubs, raising fears about widespread misuse without proper safeguards.
The Metropolitan Police's deployment of live facial recognition at London's Oxford Circus on 13 May underscores that the technology is already in operational use, even as these fundamental questions about its fairness remain unresolved.