Creating AI-Powered Fake Celebrity Porn Is Easier Than It Sounds

It’s January 2018, and the rather personal hobby of a Redditor known as “deepfakes”, first discovered by Motherboard in December last year, has become a topic of public debate. It has now spawned a new kind of app that can create AI-assisted fake porn.

Basically, it means swapping the faces of celebrities onto porn stars in videos. And it works disturbingly well: the output looks much like legitimate footage, even though it isn’t.

Since the story first broke, thousands of people have flocked to deepfakes’ NSFW subreddit, pushing the subscriber count past 15,000. What’s more, the tools and the steps required to create such videos are freely available. In fact, “deepfake” has become the umbrella term for fake adult videos generated using a neural network.

Another Redditor, known as ‘deepfakeapp’, went a step further and built a desktop app called “FakeApp” around deepfakes’ machine learning algorithm. It lets people without a computer science background create such videos using their own datasets.

So, creating a fake porn video of a celebrity involves little more than feeding the neural network a few hundred pictures, and it will produce the face-swapped video. And finding those images is hardly a challenge in the age of the internet.

Admittedly, the tech that has amused so many carries chilling implications. Internet portals are being flooded with nearly convincing fake videos of leading celebrities.

It can’t be ruled out that such easy-to-use technologies, growing more advanced by the day, could be leveraged not only against celebrities but also to defame political figures. And it’s not just about the rich and famous; the tech could let malicious actors harass ordinary people.

However, not everything associated with FakeApp is bad. A user called ‘derpfake’ used the app to recreate a scene from the movie Rogue One, showing Princess Leia as her younger self. It’s a near-identical replication of the efforts made by Hollywood’s visual effects team.

“Top is original footage from Rogue One with a strange CGI Carrie Fisher. Movie budget: $200m. Bottom is a 20 minute fake that could have been done in essentially the same way with a visually similar actress. My budget: $0 and some Fleetwood Mac tunes,” derpfake wrote.

For now, people can reassure themselves with the fact that a careful look at a fake video will still reveal it as a forgery. There’s no denying the technology will improve over time, making forged adult videos nearly impossible to spot; what matters is how far it might be exploited.

Source: Motherboard via The Verge
