It’s often said that whatever goes on the internet stays there forever. Last week, Vice uncovered a bizarre AI app called DeepNude that could easily remove clothes from images of women.
Later, we reported that numerous fake copies of the DeepNude app were available on the internet, with various websites offering alleged Android APK files of the app. However, the app was only designed for Windows and Linux-based operating systems.
In his defense, the creator of the app said he “created the project for the user’s entertainment” and wanted to make some money to support himself by selling copies of the app. Ultimately, his financial plans didn’t work out.
He later took the app offline, saying there was a high chance it could be misused and that he didn’t want to make money that way.
Still, there was a possibility that copies of the original app were lying around somewhere on the internet and might surface later. A Vice story on Monday confirmed that a version of the app was being sold on Discord for $20.
The creator has already said that people should not associate him with any copies of the DeepNude app sold in the future.
We are now hearing of more instances of the DeepNude app surfacing on different platforms. According to The Verge, the app is being distributed through various online channels, including YouTube video descriptions, Telegram channels, and 4chan.
An open-source version of the app is also available on GitHub, where the creator of the app has criticized media reports as attempts to generate publicity.
The uploaders of many of these copies claim to have tweaked the app to remove the watermarks that the original adds to its output images.
Overall, the entire DeepNude drama has raised concerns because the app can fuel abusive acts like revenge porn and public humiliation. This adds to the threat already posed by deepfake technology, which isn’t limited to social media: deepfake videos are making their way onto various pornography websites as well.
One noticeable thing about DeepNude’s output images is that they are of inferior quality and can easily be identified as fake. However, this doesn’t apply to all of them, and some could be mistaken for real photos.
It’s also said that manually tweaking images with tools like Photoshop can produce far better results, but doing so takes much longer than the roughly 30 seconds DeepNude needs. And that gap is only going to widen as the technology improves.
We are living in an age where it’s possible to generate faces of people who never existed. So, eventually, this infamous tech might be able to deliver even more convincing results.