GitHub has removed code that is based on DeepNude — an app that uses AI to digitally undress pictures of women and create fake nudes.
While the maker of DeepNude has already shut down the project and prohibited the use or possession of copies of the app, multiple repositories based on the DeepNude algorithm have cropped up on GitHub and other platforms.
Among the deleted codebases was a rip-off of the DeepNude app rather than the original. Another was described as “a work-in-progress open-source reimplementation of DeepNude based on reverse-engineering the original.”
GitHub has a policy against “sexually obscene content.” Since these projects were found to be in violation of that rule, the platform banned them along with the original repository run by DeepNude’s creator.
“We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our Terms of Service and Community Guidelines,” said a GitHub spokesperson in a statement issued to Fossbytes.
“We do not proactively monitor user-generated content, but we do actively investigate abuse reports. In this case, we disabled the project because we found it to be in violation of our acceptable use policy.”
Nevertheless, last week, the team behind DeepNude uploaded the core algorithm of the app, though not the actual app interface, noting that “the reverse engineering of the app was already on GitHub.”
“It no longer makes sense to hide the source code,” wrote the team on its now-deleted page.
The team further justified open-sourcing the code by claiming that DeepNude could be “useful for researchers and developers working in other fields such as fashion, cinema, and visual effects.”
While there is no denying that such algorithms have legitimate uses in other fields, the risk of abuse of the DeepNude algorithm far outweighs its benefits.
It is also true that nothing can make every copy of DeepNude disappear from the internet. But GitHub’s move will certainly make the app harder to find and discourage other developers from tinkering with the algorithm.