AI and deepfake porn is a form of sexual assault—and we need to talk about it

In 2021, a woman going by the name Jodie (not her real name) told BBC File on 4 how she had received an anonymous email linking her to a porn website. The site hosted several graphic photos and a video of her seemingly engaged in sexual acts. Jodie said she felt “the whole world fall away” as she saw them, because it was not her in those videos. They had all been manipulated with AI to look like her. They were deepfakes.

Jodie later found out the culprit was one of her closest friends, a man she’d known for years named Alex Woolf. He had been commissioning deepfakes of her from strangers online, offering them more photos of her in exchange for the images and videos they were willing to create for him.

When the images first came to Jodie’s attention, she was devastated and confided her hurt in him. He continued anyway, and only apologized once he had been caught, sentenced to a measly 20 weeks in prison, and ordered to pay a £100 fine to each victim. Many protested the sentence as far too weak for an offense that drove Jodie to contemplate suicide. And Jodie was not his only victim.

Jodie ultimately felt betrayed. She had never consented to her image being used at all, let alone in an explicit manner, turned into fake erotic fodder and spread all over the internet. Worse, it came from someone she had so deeply trusted, a man who, she discovered, had been combing the internet for anyone willing to make pornographic images of his friend without her permission. The photos circulated online, and Jodie knew full well that images like these could ruin her life. She felt a deep sense of violation, and she was not, and is not, the only one.

The deepfake problem affects everyone

In fact, an estimated 98 percent of deepfake videos online are pornographic, and 99 percent of that pornographic content depicts women. Female celebrities and public figures alike have painfully discovered deepfake content of themselves, from Taylor Swift to Alexandria Ocasio-Cortez. Both have made headlines calling for the creation and distribution of non-consensual deepfake pornography to be criminalized.

Not only that, but everyday women and even young teenage girls who don’t have the same celebrity power or influence are also being targeted. Many of them report that colleagues or classmates feed their social media photos into AI-powered apps that “nudify” them and generate explicit images bearing their likeness.

These apps, unfortunately, are only growing in popularity. More and more women are discovering that there are nude images of them online or circulating in their social circles: images where their faces are almost seamlessly superimposed on someone else’s naked body.

Make no mistake: Deepfakes are being used for selfish, humiliating purposes against so many women’s wishes, and they can be just as damaging as physical assault. Some victims, like 14-year-old Mia Janin, have even ended their own lives as a result of non-consensual deepfake pornographic images and the cruel bullying that came with them.

This has, thankfully, led to many bills and laws against the creation and sharing of deepfake pornography. People are seeing the harm it does, especially given how women are far more likely than men to be ostracized and humiliated if they are viewed as, or even rumored to be, promiscuous or sexual beings.

Many online are willing to say it’s harmless, that it’s just momentary gratification, that they’re not doing anything wrong, and that criminalizing the creation of these images is ridiculous. Many in this camp claim it’s just a sexual fantasy they’re trying to fulfill, and that fantasies shouldn’t be illegal.

However, it’s not just that. Sexual fantasies are imaginary. Deepfake creation is a deliberate act of stripping someone else of their agency and consent so the creator can access their body and sexuality in a way these women never agreed to. It’s a way to reap the benefits of a woman saying yes without ever asking her for a yes. It is assault.

Even if they’re created for personal use and never shared (never mind the fact that the internet is incredibly dangerous and files and images are vulnerable at any moment), it’s still cruel and disgusting. This person never agreed to let you see their body; to go against that and fabricate an approximation of it is a betrayal, one with the same intent and often the same impact as physical assault.

It is an even more ridiculous notion that we should prioritize the sexual fantasies of deepfake creators and sharers, and whatever they’re willing to do to get as close as possible to those fantasies, over the actual agency, choices, and consent of women and girls. Women deserve their integrity and self-determination. This is not difficult to grasp, nor should it even be contested. To create these images and videos in spite of that is an act of assault meant to humiliate and exercise power over women, the same way physical rape is. It is a selfish act that dehumanizes women and reduces them to mere objects to be discarded once the creator has gotten what he wants.

This is not “just” image-based sexual abuse. There is no “just.” There is no minimizing it. It is flat-out abuse, parallel to assault, and it can leave the same kind of impact. It is likewise motivated by power, humiliation, and the desire to dominate. Alexandria Ocasio-Cortez said as much after discovering images of herself: deepfakes “parallel the same exact intention of physical rape and sexual assault, [which] is about power, domination, and humiliation. Deepfakes are absolutely a way of digitizing violent humiliation against other people.”

They are parallel in another way: Just as physical assault and rape express power and dominion over someone to satisfy a desire, the creation, distribution, and exchange of deepfakes treats women as something to be consumed, an appetite satisfied at their expense.

So many of these images and videos are also being used for revenge porn, even more proof that this is about humiliation and power. To reduce the intent to mere gratification is disingenuous, and to separate this specific method of gratification from the way it takes away a woman’s ability to say no is just as underhanded.

Even worse is when people respond that women should just stay off the internet, that this comes with the territory, and that they simply shouldn’t post images of themselves. It is ridiculous that women are always the ones policed for how they react to harm done to them, while the perverts and monsters committing these acts are ignored, normalized, and thereby emboldened.

No one is entitled to your body

Existing on the internet as a woman should not be so difficult. Yet a single post can be turned into dozens of sexually explicit images of you against your will, all because of the entitlement some of these monsters feel they have over your body.

Breeze Liu, another woman similarly affected when explicit images of her were uploaded to pornographic websites, recounted how, after she saw the pictures and their deepfake iterations online, she climbed to her rooftop and contemplated ending it all.

Instead, she tried to find anyone who could help her scrub the images off the internet. She was met with slut-shaming and with people encouraging her to sympathize with the man who uploaded them, saying he had “just made a mistake.” So Liu took matters into her own hands and created AlectoAI, a free-to-use facial recognition tool that tracks misuse of your likeness, whether it’s deepfakes, fake profiles, or even revenge porn.

Liu decided to change the system in order to get justice, not just for herself but for the hundreds or even thousands of women whose images are used without their consent every day. Consent is key to her software, which centers safety and autonomy above everything.

Denying deepfakes shouldn’t be a difficult decision

It’s been clear to so many from the beginning that AI-generated pornography and deepfakes will (and already do) cause more harm than good. Technology evolving in this direction can and will be misused—and it has already claimed lives, humiliated so many, and hurt people in ways that cannot be reversed.

It’s crucial now to lay down hard limits and legal boundaries that carry heavy consequences. Woolf’s 20-week sentence was widely called out as a slap on the wrist compared to the lifetime of pain and betrayal that Jodie will endure. Janin should never have been pushed to the brink, and the young girls victimized alongside her deserve justice too. Liu should never have had to look down from her rooftop because of the shame she was made to feel.

Many will continue to argue that deepfakes are just pictures, but they are more than that, and we know it. So many of those who defend them know exactly how they affect the women being dominated through image-based violence, but won’t admit it, because admitting it would mean confronting the mental gymnastics they went through to justify their cruelty, and facing the truly evil, misogynist, and absolutely deliberate assault they are carrying out.

And for those who, by some miracle, still don’t truly understand the repercussions and impact of these generated videos and images: just ask a woman in your life what it would be like. She can tell you, and she will tell you how it would feel (or, for those who’ve suffered through it, did feel) like a violation of her consent and autonomy.

Women’s consent and control over their sexuality and the use of their image and likeness should not be contested. It is difficult enough to be on the internet and avoid being sexualized constantly. There is enough consensual pornographic material online for anybody to access and use—there is no need to strip another woman of her choice. Historically, we have been through enough of that. We don’t need to bring it to the digital sphere.

If it really came down to weighing someone’s sexual fantasy against a woman’s freedom to say no, it shouldn’t be a difficult decision. The evidence of what deepfakes can do, and have already done, should be enough. Respecting a woman’s volition should be enough. And if it’s not, then the problem rests elsewhere, and it may be time to examine how you view women, consent, and entitlement altogether.

There has never been a time when the internet was safe for women, and it will stay that way unless we point out cruelty and legally label and shame these actions for what they are. Reducing all this to “just” pictures will never get it into courtrooms. We must start calling things what they are: assault and violence. They are violations and deserve to be treated, viewed, and handled as such.

Photo by Cottonbro Studio


Eric Nicole Gonzales Salta