Met Police Deploys Palantir AI to Monitor Officer Conduct, Sparking 'Automated Suspicion' Concerns

The Metropolitan Police, the United Kingdom's largest force with 46,000 officers and staff, has confirmed it is using artificial intelligence tools supplied by the US technology firm Palantir to monitor internal behavior and identify potential professional shortcomings. The confirmation follows previous refusals by the Met to disclose its relationship with Palantir, a company that also works with the Israeli military and with US Immigration and Customs Enforcement (ICE) under Donald Trump.

AI Analysis Targets Sickness, Absences, and Overtime Patterns

According to exclusive reports, the Met is deploying Palantir's AI systems to scrutinize data from multiple internal databases, focusing on metrics such as sickness levels, absences from duty, and overtime patterns. The force asserts that there is evidence linking these factors to failures in standards, culture, and behavior. A spokesperson described the deployment as a time-limited pilot intended to help identify behavioral patterns among officers and staff, aligned with broader efforts to raise standards and improve organizational culture.

The Police Federation, which represents rank-and-file officers, has strongly condemned the approach, labeling it 'automated suspicion'. In a statement, the federation warned that officers should not be subjected to opaque or untested tools that might misinterpret workload pressures, sickness, or overtime as indicators of wrongdoing, and emphasized the need for proper supervision, fair processes, and human judgment over algorithmic profiling.

Controversies and Broader Implications for Policing

The development comes amid ongoing controversies within the Met, including vetting failures highlighted by the case of Wayne Couzens, the serving officer who murdered Sarah Everard, and wider issues of discriminatory behavior. The force maintains that Palantir's systems only identify patterns, with officers responsible for further investigation and for any determinations on standards or performance.

Palantir's involvement extends beyond policing, with significant public sector contracts in the UK, such as a £330 million deal with the NHS for a federated data platform and a £240 million agreement with the Ministry of Defence. The company has also been linked to political figures, including Peter Mandelson, who visited Palantir's showroom with Keir Starmer before Mandelson's dismissal over ties to Jeffrey Epstein.

Political and Regulatory Responses to AI Deployment

Martin Wrigley MP, a Liberal Democrat member of the Commons science, innovation and technology select committee, expressed concerns about employee rights, questioning who oversees Palantir's pervasive role in government operations. Meanwhile, Labour has committed to supporting responsible AI adoption in policing, planning over £115 million in investments over three years to develop and test AI tools across all 43 forces in England and Wales.

A Palantir spokesperson said the company was proud of its software's role in improving public services, including police operations and NHS efficiency. As AI tools become more deeply embedded in law enforcement, debates over transparency, ethics, and the balance between technology and human oversight continue to intensify, shaping the future of policing standards and public trust.