The app in question was called DeepNude, and it allowed its users to “undress” anyone based on an uploaded image of the person.
While the developers realised their error quickly enough, and the app has been removed from all platforms, it is now more important than ever to talk about the dangers of creating platforms that encourage men to treat women as objects of uncontrolled desire.
What is perhaps worse is that DeepNude was built specifically to undress women; it did not work on images of men.
This fact alone prompts one to consider the kinds of thoughts going through certain men’s minds when they objectify women. Clearly a woman’s best interests, let alone her safety, are not taken into consideration in such instances.
A woman’s privacy, and her obvious wish not to be seen naked by a complete stranger, whether the image was real or not, were completely disregarded.
And the thought is quite scary.
The nude picture may not have been real, but it displayed a woman’s naked body in vivid detail.
What happens when the image alone is not enough? Does this not prompt the viewer to go after the “real thing”?
While it is a strong assumption to make, we cannot ignore the harsh truth about the rate at which women are being raped, abused and even murdered in today’s society. And this was not even the only concern.
The Washington Post reports that another concern raised by these DeepFake apps is that people will now be able to fabricate convincing false depictions of things that never happened.
Before technology was as advanced as it is now, it was easier to spot a photoshopped or edited picture or video. DeepFakes, however, were designed to create the most realistic impressions possible using AI, which is a growing concern.
“Deep-fake technologies will enable the creation of highly realistic and difficult to debunk fake audio and video content,” Danielle Citron, a law professor at the University of Maryland, testified before a House committee on the dangers of DeepFakes this month.
“Soon, it will be easy to depict someone doing or saying something that person never did or said. Soon, it will be hard to debunk digital impersonations in time to prevent significant damage,” said Citron, according to The Washington Post.
Meanwhile, after the app was shut down, the developers said in a statement posted to the @deepnudeapp Twitter account on 27 June 2019 that “the world was not ready” for DeepNude.
And as a woman who lives in constant fear of being objectified and hypersexualised, physically abused, raped or murdered, I don’t think the world will ever be ready for such a distasteful app.