Palantir Extends Influence in UK with Access to Sensitive Financial Data
The US AI company Palantir, co-founded by billionaire Donald Trump donor Peter Thiel, has been awarded a contract by the Financial Conduct Authority (FCA) to access and analyze highly sensitive regulatory data. The three-month trial, costing over £30,000 per week, aims to help the FCA combat financial crimes such as fraud, money laundering, and insider trading. The deal marks a significant expansion of Palantir's reach into the British state, raising fresh concerns about privacy and ethics.
Deepening Ties and Growing Controversy
Palantir's appointment followed a competitive procurement process in which it faced only one unnamed rival. The company already holds more than £500 million in UK public contracts, including agreements with the NHS, military, and police. Under the new deal, Palantir will apply its AI system, Foundry, to the FCA's vast "data lake," which includes case intelligence files marked as highly sensitive, information on problem firms, fraud reports from lenders, and consumer complaints to the financial ombudsman. According to the Guardian's reporting, the data sources include phone call recordings, emails, and social media posts.
Privacy and Ethical Warnings from Experts
Campaign groups and experts have voiced strong objections, highlighting "very significant privacy concerns." Professor Michael Levi, a money laundering expert at Cardiff University, said that while AI could be valuable for tackling financial crime, legitimate questions remain about whether Palantir's owners might tip off associates about the regulator's methodologies. He emphasized the need for clear protocols on how the data is used. Inside the FCA, sources expressed worries about the company's ethical reliability, questioning how the regulator could ensure Palantir does not share its sensitive detection methods.
Regulatory Safeguards and Data Control Measures
The FCA has stated that Palantir will act as a "data processor" rather than a "data controller," meaning it can only operate under the regulator's instructions. The FCA retains exclusive control over the encryption keys for the most sensitive files, and the data will be hosted and stored solely in the UK. Palantir must destroy the data after the contract ends, and any intellectual property derived from the analysis is to remain with the FCA. Although official guidance encourages the use of synthetic data, the FCA opted for real data to ensure a meaningful test, while confirming that Palantir cannot copy the data to train its own products.
Broader Implications and Historical Context
Palantir's technology has been used by the Israeli military and in US immigration crackdowns, leading leftwing MPs to label it a "highly questionable" company. Recent UK contracts, including a £330 million deal with the NHS and a £240 million agreement with the Ministry of Defence, have sparked resistance from doctors and MPs citing human rights concerns. Christopher Houssemayne du Boulay, a barrister specializing in financial crime, warned that ingesting such data into AI systems poses serious confidentiality risks, as it may include personal information from innocent individuals caught up in enforcement investigations.
An FCA spokesperson defended the move, saying that effective use of technology is vital to fighting financial crime and that strict controls are in place to protect the data. Palantir referred requests for comment to the FCA, while pointing to its stated commitment to human rights and to past work scheduling NHS operations and supporting UK police. As Palantir's role in the UK public sector grows, debates over privacy, ethics, and data security are set to intensify.