We want this app that undresses women using AI to stay dead

At this point, saying we’re living in a Black Mirror episode because of some new technology is almost a cliché. But we’re reaching for the expression again after finding out there was an app called DeepNude.

We say “was” because the developers recently shut down the project after realizing how badly users could abuse it. The app was made to “undress women” using artificial intelligence. According to MIT Technology Review, DeepNude swapped women’s clothing for nude bodies, making it easy to humiliate someone online with a realistic-looking fake photo.

This is just one of the many forms of “deepfakes,” which are made by combining and superimposing existing photos and videos onto source material using a machine learning technique known as a generative adversarial network, or GAN. (It’s the same AI system that did Kylie Jenner’s makeup on Dazed Beauty’s digital cover.)
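For the technically curious, a GAN pits two neural networks against each other: a “generator” that fabricates images and a “discriminator” that tries to tell the fakes from real photos. Each trains against the other until the fakes become convincing. The toy sketch below shows that adversarial loop in its simplest form; to be clear, it’s our own illustrative example with made-up layer sizes, not DeepNude’s actual code.

```python
# A toy GAN in PyTorch, for illustration only. This is NOT DeepNude's code;
# the architecture and numbers below are made up to show the core idea:
# a generator (G) learns to produce fakes that a discriminator (D)
# can no longer distinguish from real images.
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 784  # e.g., a flattened 28x28 image

# Generator: turns random noise into a fake "image"
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)

# Discriminator: outputs the probability that an image is real
D = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

# Stand-in for a batch of real training images, scaled to [-1, 1]
real = torch.rand(32, image_dim) * 2 - 1

for step in range(200):
    # 1) Train D: real images should score 1, G's fakes should score 0
    fake = G(torch.randn(32, latent_dim)).detach()
    loss_d = bce(D(real), torch.ones(32, 1)) + bce(D(fake), torch.zeros(32, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # 2) Train G: produce fakes that D scores as real (1)
    loss_g = bce(D(G(torch.randn(32, latent_dim))), torch.ones(32, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

DeepNude reportedly built on pix2pix, an image-to-image variant of the same idea trained on thousands of photos of women; the loop above is just the bare-bones version of what such tools automate behind a single click.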

READ MORE: We are slightly creeped out by the potential of AI makeup

These kinds of posts have been making the rounds online for the past couple of years. Some of the most popular deepfakes are pornographic videos that superimpose celebrities’ faces onto porn stars’ bodies. A quick Google search turns up several sites hosting this material. It’s disturbing AF.

Still, DeepNude’s brief existence shows how easily developers can put software like this in the public’s hands. It’s also problematic that the developers made it for entertainment purposes. They took it offline for two reasons: 1) the servers kept crashing, and 2) they realized that a high volume of users would also mean a high likelihood of misuse.

“We thought we were selling a few sales every month in a controlled manner… We never thought it would become viral and we would not be able to control traffic,” the developer said in a Twitter statement. “Despite the safety measures adopted (watermarks) if 50,000 people use it, the probability that people will misuse it is too high. We don’t want to make money this way.”

But really, what did they expect to happen? The fact that they chose to make an app that specifically targets women’s bodies is alarming in itself. When a female celebrity’s nudes are spread online without her consent, she’s the one who gets slut-shamed. One recent example was Bella Thorne, who was victim-blamed by Whoopi Goldberg for taking the nude selfies that hackers stole.

And famous or not, there’s a double standard in how leaked nude photos are received. If it’s a woman, she’s shamed while men thirst over her body. If it’s a man, he’s praised for his “package.”

You might be wondering why it targets women specifically. Motherboard reported that DeepNude only worked on women: when it was used on a man, the result was less realistic because his body would simply be given a vulva. Still, that doesn’t take away from the developers’ misogynistic intent in making it easier for people to create deepfakes of women.

READ MORE: Whoopi Goldberg slut-shamed Bella Thorne for taking nude photos

The claim that DeepNude somehow “dispenses with the idea that this technology can be used for anything other than claiming ownership over women’s bodies” is also absolute bull. It’s as if the developers are saying they made the app as a response to the nonconsensual posting of nudes and explicit videos. But as Katelyn Bowden, founder and CEO of Badass (Battling Against Demeaning and Abusive Selfie Sharing), told Motherboard, this would’ve been a terrifying platform where revenge porn could easily be manufactured.

Law professor Danielle Citron also told the outlet, “Yes, it isn’t your actual vagina, but… others think that they are seeing you naked. As a deepfake victim said to me, it felt like thousands saw her naked; she felt her body wasn’t her own anymore.”

The DeepNude app may be offline now, but that doesn’t mean others won’t jump at the chance to create another one just like it. Artificial intelligence is an incredible feat, but this isn’t what we should be using it for. So stay vigilant on the internet: not everything you see is real.

 

Photo courtesy of Pexels



Jacqueline Arias