

6th Feb 2022

Deepfakes: Should we be concerned about this terrifying new form of cyber abuse?

Katy Brennan

It’s never been easier to make something fake look real.

Deepfake images and videos are becoming increasingly common online.

Think of them as a bigger and badder version of Photoshop – altering images is nothing new, but deepfakes are super convincing and almost impossible to distinguish from reality.

The technology simulates real people in false situations by replacing a person in an existing image with someone else’s likeness.

Many of us will have seen the viral fake videos of Obama or Mark Zuckerberg doing the rounds – there’s even a TikTok account dedicated solely to deepfakes of Tom Cruise.


Deepfake technology is becoming accessible to everyone, and unfortunately, it’s now being used almost exclusively for one thing – pornography.

The vast majority of deepfake content on the internet is sexually explicit images of women.

Actress Bella Thorne is just one of the many female celebrities who have fallen victim to this terrifying form of cyber abuse.

A video Thorne posted on Instagram of herself crying when her father passed away was sickeningly used to superimpose her face onto another woman’s body and create a deepfake of her masturbating.

She has since expressed concern that the technology will be largely used to blackmail women.

“Why would you even put this thing out there when you know what a huge percentage of people are going to do with it?” she told the BBC.

She also referred to the technology as “a breeding ground for underage pornography.”

And it’s not just celebrities. 

Last month, a young girl was found dead following the distribution of fake images of her online.

Edited naked photographs of 17-year-old Basant Khaled were posted after she had refused to go on a date with a young man.

After the fabricated images were seen by friends, fellow students, and teachers at her school, Basant tragically took her own life on 23 December.

Sadly, as the technology becomes more readily available, this is becoming ever more common and has been described by Professor Clare McGlynn as an “epidemic”.

McGlynn, who has spent 20 years influencing and shaping law reform – specialising in the legal regulation of pornography, sexual violence and image-based sexual abuse – says this type of thing is happening at an alarming rate.

“It is difficult to put an exact figure [on how common it is] because there are so many thousands of websites dedicated to deepfake pornography,” she tells Her.

“There are also vast swathes of nonconsensual, altered imagery on the mainstream pornography websites.”

To make matters worse, McGlynn says that organisations like Sensity have done surveys in this field and confirmed that up to 97% of all deepfakes on the internet are pornographic images of women.

“We know from surveys of victims of image-based sexual abuse that about a third of those who had images taken without their consent, the images have been altered.

“Deepfake pornography is the new frontier in image-based sexual abuse. 

“This is where we are going to see an exponential rise in the nature and prevalence of abuse.”

In Ireland, thanks to the introduction of Coco’s Law, sharing intimate images without a person’s consent is now illegal. The wording of the bill also covers images that have been “altered”. Non-consensual images or videos, including deepfakes, can also be reported.

However, in several countries worldwide, including England, it is not a criminal offence to distribute sexual images that have been altered without consent.

“A victim’s first and immediate demand is usually to try and get material taken down,” McGlynn explains. “This is possible, but is often a very complicated and complex process. Sometimes the material is taken down, only for it to reappear very quickly.”

In a world engulfed by technology, where most of us openly share our lives online – we are all potential victims. McGlynn wants to make it clear that the burden does not fall on women.

“None of us can protect ourselves from having images of us altered to make them pornographic and sexual. The focus must resolutely be on perpetrators who are harming women and girls.

“It is not up to women and girls to protect ourselves from having images of us altered to make them pornographic and sexual.”