1 in 4 Brits Unconcerned by Non-Consensual Sexual Deepfakes, Survey Reveals

Alarming Survey Reveals Public Attitudes Towards Sexual Deepfakes

A new survey commissioned by the office of the police chief scientific adviser has found that one in four people in the UK either see nothing wrong with creating and sharing sexual deepfakes or feel neutral about the practice, even when the person depicted has not given their consent. The research, which involved 1,700 respondents, found that 13% saw nothing wrong with creating and sharing this digitally altered, AI-generated intimate content without permission, while a further 12% felt neutral about its moral and legal acceptability.

Police Warning on AI-Fuelled Abuse

Det Ch Supt Claire Hammond, from the national centre for violence against women and girls and public protection, issued a stark warning in response to the findings, stating that "the rise of AI technology is accelerating the epidemic of violence against women and girls across the world." She also accused technology companies of being complicit in this abuse, saying they have made creating and sharing abusive material as simple as clicking a button. Hammond reminded the public that "sharing intimate images of someone without their consent, whether they are real images or not, is deeply violating", and urged victims to come forward, assuring them that this is a serious crime, that they will be supported, and that "no one should suffer in silence or shame."

Normalisation of Abuse and Call for Action

The report, produced by the crime and justice consultancy Crest Advisory, contained several concerning statistics. It found that 7% of respondents had themselves been depicted in a sexual or intimate deepfake, and that of these victims only about half (51%) had reported the incident to the police, with embarrassment and a belief that the offence would not be taken seriously cited as the top reasons for staying silent. The data also pointed to a troubling trend: men under 45 were more likely to find it acceptable to create and share deepfakes, and were also more likely to watch online pornography and to hold misogynistic views.

Callyane Desroches, the report's author, expressed deep concern, noting that the creation of deepfakes is "becoming increasingly normalised as the technology to make them becomes cheaper and more accessible." She highlighted that the vast majority of this content is sexualised and targets women. The report also revealed that one in 20 respondents admitted to having created a deepfake in the past, and more than one in ten said they would create one in the future.

Activist Cally Jane Beech, who campaigns for better protection for victims, described these as "very worrying times", warning of a "dark ripple effect" from a generation that grew up with no digital safeguards. She emphasised that stopping this abuse starts at home, with education and open conversation. Under the new Data (Use and Access) Act, creating non-consensual sexually explicit deepfakes is a criminal offence, underscoring the legal gravity of this rapidly evolving form of abuse.