Palantir extends reach into British state as it gets access to sensitive FCA data | Palantir | The Guardian
Palantir, co-founded by the billionaire Donald Trump donor Peter Thiel (pictured), has been appointed for a three-month trial period. Photograph: Rebecca Blackwell/AP

Exclusive: Allowing US tech firm to analyse intelligence in name of tackling fraud raises fresh concerns over privacy
Robert Booth, UK technology editor
Sun 22 Mar 2026 12.00 EDT (last modified Sun 22 Mar 2026 17.29 EDT)

Palantir is to be granted access to a trove of highly sensitive UK financial regulation data, in a deal that has prompted fresh concerns about the US AI company’s deepening reach into the British state, the Guardian can reveal.

The Financial Conduct Authority (FCA) has awarded Palantir a contract to examine the watchdog’s internal intelligence data in an effort to help it tackle financial crime, which includes investigating fraud, money laundering and insider trading.

The Miami-based company, co-founded by the billionaire Donald Trump donor Peter Thiel, has been appointed for a three-month trial, at a cost of more than £30,000 a week, to analyse the FCA’s vast “data lake”. The trial could lead to a full procurement of an AI system.

The deal is part of the FCA’s drive to use digital intelligence to better focus resources on rule-breaking among the 42,000 financial services firms it regulates, from major banks to crypto exchanges.

There was only one other, unnamed competitor for the contract. Palantir already holds more than £500m in UK public deals, including with the NHS, military and police.

The contract has prompted warnings of “very significant privacy concerns”. Palantir is expected to apply its AI system, known as Foundry, to huge quantities of information held by the watchdog, including case intelligence files marked highly sensitive; information on so-called problem firms; reports from lenders about proven and suspected frauds; and data about the public, including consumer complaints to the financial ombudsman.

The data includes recordings of phone calls, emails and trawls of social media posts, the Guardian understands.
The FCA is one of several UK agencies that aim to stop the financial crimes underpinning harms such as the drug trade and human trafficking.

The deal has raised concerns inside the FCA. One source said: “Once Palantir understands how we detect money-laundering threats, how do we know that they are ethically reliable enough not to go on to share that information?”

Palantir’s technology is used by the Israeli military and in the US president’s ICE immigration crackdown, leading leftwing MPs in the House of Commons last month to call it a “highly questionable” and “ghastly” company. In 2023 it signed a £330m deal with the NHS, which has sparked resistance from doctors, and in December 2025 a £240m contract with the Ministry of Defence, which prompted MPs to highlight “reports of serious allegations of complicity in human rights violations and the undermining of democratic processes made against Palantir”.

Palantir has previously defended its work, saying it has led to about 99,000 extra operations being scheduled in the NHS, has helped UK police tackle domestic violence, and that it “takes a rigorous approach to respecting human rights”.

Prof Michael Levi, an internationally recognised expert in money laundering at Cardiff University, said there was “serious under-exploitation” of data held by financial regulators, making AI a potentially valuable tool for tackling financial crime.
But he said it was “a relevant question as to whether Palantir’s owners might tip off their friends about methodologies”.

“What are the protocols agreed between the FCA and Palantir about the onward use of things that they have learned in that process?” he said.

The FCA said the terms of the contract meant Palantir would be a “data processor”, not a “data controller” – meaning it could act only on instruction from the regulator. The FCA said it would retain exclusive control over the encryption keys for the most sensitive files, and that the data would be hosted and stored solely in the UK. Palantir will have to destroy the data after completion of the contract, and any intellectual property derived from the data trawling is to be retained by the FCA.

The FCA considered using dummy data or scrambling company and individual names, but decided using real data was the only worthwhile test, even though guidelines encourage the use of synthetic data in pilots.

“When the FCA carries out an enforcement investigation, it has powers to compel firms to hand over vast quantities of data,” said Christopher Houssemayne du Boulay, a partner and barrister at the law firm Hickman & Rose who specialises in defending serious and complex financial crime cases. “We could be talking about hundreds of whole email accounts and full financial records. Many innocent people will be caught up in that and the data may contain bank account details, email addresses, telephone numbers and other personal information.

“If you ingest that data and use it to train an AI system, there are very significant privacy concerns. There should be serious confidentiality requirements regarding what Palantir does with the data.”

The FCA said Palantir could not copy the data to train its products.
Palantir referred a request for comment to the FCA.

A spokesperson for the FCA said: “Effective use of technology is vital in the fight against financial crime and helps us identify risks to the consumers we serve and markets we oversee. We ran a competitive procurement process and have strict controls in place to ensure data is protected.”