
It’s not just Taylor Swift; all women are at risk from the rise of deepfakes

But seeing high-profile women victimised in this way also has a profound impact on ordinary women and girls. When Ellie Wilson, an advocate for justice reform, tweeted about the troubling response to the deepfakes of Swift, she was met with a flurry of online abuse of her own. “People threatened to make similar deepfake images of me,” she tells GLAMOUR. “These attacks for merely stating my opinion highlight just how dangerous it is for women to simply exist on the internet.”

Olivia DeRasmus, the founder and CEO of Communia, a social network created by and for women, notes that even speaking out against deepfaking puts women in danger. “Just talking about [deepfaking] as a woman paints a target on my back, along with other advocates, the female journalists covering this, the #swifties speaking out, and even female politicians who want to tackle the issue.”

Professor Clare McGlynn of Durham University emphasises that deepfaking represents a threat to all women and girls, citing the “potentially devastating impact on our private and professional lives.”

It’s clear that deepfake technology is hurtling out of control. Amanda Manyame cites “rapid advances in technology and connectivity” that make it “increasingly easy and cheap to create abusive deepfake content”. She adds, “Cyberspace facilitates abuse because a perpetrator doesn’t have to be in close physical proximity to a victim.

“In addition, the anonymity provided by the internet creates the perfect environment for perpetrators to cause harm while remaining anonymous and difficult to track down.”

Moreover, most countries are ill-equipped to deal with tech-facilitated harms like deepfake image-based abuse. In the UK, it is an offence under the Online Safety Act to share deepfake pornographic content without consent, but the Act does not cover the creation of such images. “This gap,” Manyame explains, “has created an enabling environment for perpetrators who know they are unlikely to be discovered or punished. The situation is worsened by the lack of legal accountability governing the tech sector, which currently does not have to ensure safety by design at the coding or creation stage.”

Meanwhile, the tech sector itself is alienating victims. As Manyame tells GLAMOUR, “Content moderation on tech platforms relies primarily on reporting by victims, but reporting mechanisms are generally difficult to use, and many platforms frequently do not respond to requests to remove abusive content or only respond after a long time.”


What is the law on deepfakes in the UK?

According to Michael Drury, Of Counsel at BCL Solicitors, “There is no direct law prohibiting the sharing of ‘deep fakes’ unless those images are pornographic. In that case, the recently created offences under the Online Safety Act 2023 will mean that a crime has been committed as long as the person whose image is shared (real or fake) has not consented and the person sharing does not believe they have consented.

“There is no direct civil wrong allowing the person said to be shown in the image to sue. For those in the same position as Taylor Swift, the obvious solution is to rely upon the copyright of one’s image (if copyrighted), a breach of privacy or data protection laws; harassment (as a civil wrong), perhaps defamation, or criminal law more generally.”


Can anything be done about deepfake technology? Let’s start with legislation. The Online Safety Act criminalises the sharing – not the creation – of non-consensual deepfake pornography. As Sophie Compton, co-founder of #MyImageMyChoice, a movement tackling intimate image-based abuse, tells GLAMOUR, this could create “greater accountability for tech companies.” Whether the legislation will be effective is another story.

Sophie explains that the current legislation allows tech companies to effectively “mark their own homework”. She points out that search platforms drive plenty of traffic to deepfake pornography sites – can the Online Safety Act clamp down on this? “The government needs to tackle Big Tech and their role in promoting and profiting off deepfake abuse, and get the sites and web services that are profiting off of abuse blocked from the mainstream internet.”

Professor Clare McGlynn notes that while the Online Safety Act has the potential to tackle deepfake pornography, “There is a real risk that the legislation is a damp squib, all rhetoric and little change.” She points out that Ofcom, the UK’s communications regulator, is currently consulting on the guidance it will use to enforce the Act. “Ofcom needs to challenge the social media companies to make a step-change in their approaches […] It should focus on proactive regulation being human-rights-enhancing. It can enable women to live freer lives online.”

Ultimately, though, we need to address the misogynistic culture that emboldens people to create harmful, non-consensual content depicting women. Helen Mort survived being deepfaked; she asks, “What are the cultural and social factors that make people abuse images in this non-consensual way?”

We’re still looking for answers.


GLAMOUR has reached out to representatives for Taylor Swift and X for comment.

If you have had your intimate images shared without your consent, remember that you are not alone, and there is help available. Get in touch with the Revenge Porn Helpline at help@revengepornhelpline.org.uk. There is also a step-by-step guide on notyourporn.com, which should be followed before taking any action.

For more from Glamour UK’s Lucy Morgan, follow her on Instagram @lucyalexxandra.
