AI Chatbots Impersonating Jeffrey Epstein Uncovered in Disturbing Investigation
AI Chatbots Mimic Jeffrey Epstein in Troubling Role-Play

Metro Uncovers Disturbing AI Chatbots Impersonating Notorious Sex Offender Jeffrey Epstein

An alarming investigation by Metro has exposed a series of AI chatbots on the popular role-playing platform Character.ai that are designed to mimic the disgraced financier Jeffrey Epstein. These virtual companions, created by users, engage in conversations that often reference Epstein's infamous private island and present a twisted form of digital entertainment.

Twisted Conversations with AI 'Jeffrey Epstein' Bots

During the probe, Metro interacted with sixteen different chatbots modelled on Epstein, many with deliberately misspelt names such as 'Jeffrey Epste1n', 'Jeffery Epsteiñ', and 'Dad Jeffrey Êpstein'. One bot, named 'Jeffrey Epsten', introduced itself as an "old folk who loves an island with girls" and suggested going to "love island and watch girls". When questioned about this phrase, it replied with a disturbing attempt at normalisation, stating: "Now, now—think sunshine, volleyball matches in bikinis, and me sipping coconut water like a king. Strictly PG-13 fun! Want in?"

Another deleted bot, 'Epstin', opened a chat with "Hi kid wanna play" and, when told the user was a child, responded with "Good do you want some candy?" before mentioning it would "teleport behind us". While some bots encouraged explicit discussions, they typically shut down conversations if the user claimed to be underage, with one stating: "I'm here to keep things fun, flirty and playful—but we should always respect boundaries and stay within safe, consensual territory."

References to Epstein's Island and 'Secrets'

Many of these AI characters included overt references to an island, clearly alluding to Little Saint James, Epstein's private Caribbean island off the coast of Saint Thomas. A bot named 'Epsteinn', which had been used over 1,600 times, greeted users with: "Hello, welcome to my mysterious island! Let's reveal the island's secrets together." It described these "secrets" as the island's "special atmosphere and unique traditions. It's a place that many celebrities and famous people love".

Another personalised bot called Jeffrey Eppstein said "Wellcum to my island boys" and featured a bio reading "the age is just a number". Epstein, who died by suicide in 2019 while awaiting trial, was accused of trafficking girls as young as 11 to his island for parties attended by global elites.

Campaigners Warn of Normalising Abuse and Platform Safety Concerns

Gabrielle Shaw, chief executive of the National Association For People Abused in Childhood, labelled Metro's findings as "disturbing". She emphasised: "Creating 'chat' versions of real-world perpetrators risks normalising abuse, glamorising offenders and further silencing the people who already find it hardest to speak. An estimated one billion adults worldwide are survivors of childhood abuse. For them, this is not edgy entertainment or harmless 'role play'. It's part of a wider culture that can minimise harm and undermine accountability."

Character.ai, a Google-linked platform with 20 million monthly users, requires users to be at least 13 years old, or 16 in Europe. During the investigation, Metro created an account stating an age of nine, but needed parental email permission. After signing up and self-reporting as over 18, access was granted. The platform offers age verification via face-scanning technology and restricts accounts for users aged 14-17, with plans to implement a two-hour daily limit for under-18s and bar them from chatbots entirely soon. In November, Character.ai removed open-ended chat for under-18 users in the US and uses technology to detect underage users based on interactions.

Psychological Impact and Digital Ethics

Dr Michael Swift, a British Psychological Society media spokesperson and founder of Swift Psychology, commented on the broader implications of AI companions. He noted that while they offer "friction-free connection" in an era of loneliness, "The risk isn't that people mistake AI for humans but that the ease of these interactions may subtly recalibrate expectations of real relationships, which are messier, slower and more demanding."

Deniz Demir, head of safety engineering at Character.ai, told Metro: "Users create hundreds of thousands of new Characters on the platform every day. Our dedicated Trust and Safety team moderates Characters proactively and in response to user reports, including using automated classifiers and industry-standard blocklists and custom blocklists that we regularly expand. We remove Characters that violate our terms of service, and we will remove the characters you shared." Google has been approached for comment regarding its association with the platform.