Survivors and campaigners are disappointed by the government’s latest update on how it will tackle deepfake abuse.
Yesterday (22 January), the government tabled an amendment to the Data (Use and Access) Bill that will criminalise intentionally creating a sexually explicit deepfake without consent. On the surface, this appears to be a step in the right direction. But if you look closer, the proposals leave a lot to be desired.
Why? Under the new proposals, to secure a conviction against someone who has created a deepfake of you without your consent, you must prove that the perpetrator intended to cause you “alarm, humiliation, or distress” and/or that they created it for the “purpose of sexual gratification.”
But why should survivors, who know they didn’t consent to this horrific imagery being made, have to prove the perpetrator’s motivation? Shouldn’t their lack of consent be enough to warrant a conviction?
Under the latest proposed legislation, those found guilty of creating deepfake images without consent and causing harm or receiving sexual gratification face an unlimited fine – rather than jail time.
Jodie*, a survivor of deepfake abuse and women’s rights campaigner, describes the legislation as a “missed opportunity to prioritise victims and their lived experiences”. She argues that Baroness Charlotte Owen’s Private Member’s Bill, previously dismissed by the government, was a “consent-based proposal” which recognised that it’s “often impossible for survivors to prove a perpetrator’s intent”.
She says the proposed amendment will “leave countless victims without the justice they desperately need and deserve.”
Jodie’s statement in full:
“The government’s amendment to the Data Bill is a missed opportunity to prioritise victims and their lived experiences. The consent-based proposal from Baroness Owen, which is centred on survivor experiences, offers far greater protections, recognising that it is often impossible for survivors to prove a perpetrator’s intent. A motivation and consent-based law, which the government is now proposing, will leave countless victims without the justice they desperately need and deserve.
There is also an urgent need for clear guidance on solicitation offences. Too often, perpetrators outsource the creation of deepfake images, and under this proposed legislation, victims could face cases being dismissed by the police and CPS. The law must be clear – anything less risks leaving victims vulnerable.
Equally concerning is the decision to reduce this crime to a fine. Deepfake abuse is a form of sexual violence that causes profound harm to its victims. It should be treated with the seriousness it warrants, including the possibility of prison sentences for the most severe cases. Justice demands that we prioritise survivors, uphold their autonomy, and enact laws that deter this devastating abuse.
Tackling deepfake abuse requires more than just legislation. It demands a holistic approach. This includes better funding for support services like the Revenge Porn Helpline, which plays a critical role in helping victims remove damaging content. We also need comprehensive preventative education in schools and targeted public awareness campaigns to reach those outside formal education.
It’s crucial we avoid over-criminalising, particularly young people, but we mustn’t shy away from recognising this crime for what it is – sexual abuse. Laws should reflect the severity of the harm caused while ensuring survivors have access to the justice and support they deserve.”
Cally Jane Beech, a survivor of deepfake abuse and GLAMOUR’s Activist of the Year, said the government has “completely missed the mark”, adding, “If they listened to what the survivors have been campaigning for, they would understand that intent to cause harm leaves perpetrators still able to use images of others without their consent.”