Date Published
April 16, 2024

UPDATE: The proposed criminalisation of creating non-consensual sexually explicit deepfakes did not become an offence as the legal proposals were dropped when the 2024 General Election was called.

The Ministry of Justice has today (16th April 2024) announced that individuals who create sexually explicit deepfakes will face prosecution under a new criminal offence being introduced.

Under the new law, those who create this imagery could face a criminal record and an unlimited fine. If the image is then shared more widely, offenders could be sent to jail. The new offence will also strengthen existing ones: if a person both creates a deepfake and then shares it, they could be charged with two offences and potentially face an increased sentence.

The End Violence Against Women Coalition is among a broad church of experts highlighting the harm of sexually explicit deepfakes to victims, and the profound impact they have on all aspects of victims' lives – from their wellbeing and mental health to relationships, employment and ability to take part in public life.

Non-consensual pornography* constitutes 96% of all deepfakes found online, with 99.9% depicting women. Tackling this is an urgent human rights and equalities issue.

We welcome this new offence, being created under an amendment to the Criminal Justice Bill, and are grateful to the many survivors and experts who have fought hard for this important win, including Professor Clare McGlynn and Baroness Charlotte Owen.

However, we are concerned that the threshold for this new law rests on the intentions of the perpetrator, rather than on whether or not the victim consented to their images being used in this way. The government’s announcement indicates that creating a sexually explicit deepfake will be a criminal offence only where the creator intends to cause alarm, humiliation or distress to the victim, even if they have no intent to share it.

The law criminalising image-based sexual abuse, including the filming and sharing of sexual activity without the victim’s consent, was recently amended to remove this requirement to prove the intent of the perpetrator, given the immense difficulties in evidencing this and the loophole it provides perpetrators seeking to avoid prosecution.

Sexual violence in the form of deepfake images has skyrocketed in recent years, with images being viewed millions of times a month across the world. The fake images and videos are made to look hyper-realistic with the victim usually unaware and unable to give their consent to being sexualised in such a way.

Tech platforms must be held accountable

The End Violence Against Women Coalition has long called for better regulation of the tech companies that profit from the abuse of women and girls. 100,000 members of the public backed our successful campaign with Glitch to change the new law so that it addressed violence against women. But whether or not it will deliver actual protections from this abuse now depends on the regulator Ofcom’s interpretation of the law as it creates guidance to hold tech companies accountable.

Along with 44 experts and organisations tackling online violence against women, we wrote an open letter to Ofcom expressing concern that its approach to the new law is severely lacking; calling on the regulator to urgently change course so that this once-in-a-generation opportunity to protect women and girls online is not wasted.

In addition to better regulation, we need quality relationships and sex education in schools and public information campaigns to raise awareness of the harm of online abuse, shift the attitudes that justify and normalise it, and ultimately prevent it from happening in the first place.

Andrea Simon, Director of the End Violence Against Women Coalition (EVAW), said:

“We welcome the criminalisation of the creation of sexually explicit deepfakes as a means to drive tech platforms to take preventative action to address this abuse. This law should lead to obligations on tech platforms and payment providers to stop the promotion and facilitation of this deeply harmful form of violence against women – violence they currently profit from richly.

However, unless the government amends the law so that the offence is based on the absence of consent rather than the perpetrator’s intent, there will be a massive loophole in the law. This will not only give tech platforms a free pass to claim their sites are for ‘humour’, it will also give perpetrators a ‘get out of jail free’ card for their defence, which will ultimately prevent victims from accessing justice. Evidencing a perpetrator’s intent is incredibly difficult, and we’ve seen this loophole block justice for victims of image-based sexual abuse. The government has finally acknowledged and fixed this problem in that offence, so why are victims of deepfakes, which are increasing exponentially, being left to suffer the same outcome?

We call on the government to close this loophole and to prioritise public information campaigns and education about online abuse, and we urge Ofcom to deliver robust guidance that makes sure tech companies can’t get away with profiting from this harm.”

ENDS
Notes

We use the term ‘pornography’ here per the research cited. However we are clear that these deepfakes constitute sexual violence.

Media contact

Sinead Geoghegan, Head of Communications, media@evaw.org.uk 07960 744 502
