
It took us less than 30 seconds to find banned ‘deepfake’ AI smut on the internet

Don’t blame AI, blame the creeps who put famous heads on pr0n actors’ bodies

Such videos are made possible by neural network technology that can learn the features of anyone’s face and map them onto bodies in videos.
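The article doesn’t detail the underlying code, but the face-swap recipe commonly described for these tools is a shared autoencoder with one decoder per identity: a single encoder learns a generic face representation, while each person gets their own decoder. The sketch below (PyTorch, purely illustrative, with made-up layer sizes and class names) shows the trick: encode person A’s face, then decode it with person B’s decoder.

```python
# Minimal sketch of the shared-encoder / per-identity-decoder face-swap idea.
# NOT the actual FakeApp code; layer sizes and names are illustrative only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a latent code shared across identities."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs one specific person's face from the shared latent code."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 64, 16, 16)
        return self.net(h)

# One shared encoder, one decoder per identity.
encoder = Encoder()
decoder_a = Decoder()   # trained to reconstruct faces of person A
decoder_b = Decoder()   # trained to reconstruct faces of person B

# After training each (encoder + decoder_X) pair on aligned face crops of
# person X, the "swap" is simply decoding A's latent code with B's decoder:
face_a = torch.rand(1, 3, 64, 64)        # dummy stand-in for a face crop of A
swapped = decoder_b(encoder(face_a))     # posed like A, but rendered as B
print(swapped.shape)                     # torch.Size([1, 3, 64, 64])
```

Because the encoder is shared, it learns pose and expression in a way both decoders understand, which is what makes the final swap look plausible.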

Reddit appears to have been a prime mover in applying the technology to pornography. The site hosted a subreddit, r/deepfakes, created by a member called Deepfakeapp, who built a desktop app to make it easier for people without much machine-learning knowledge to create their own X-rated fantasy flicks.

The app is based on an algorithm made by another Redditor, Deepfake, who published some of his results back in December 2017.


Reddit has since updated its rules to forbid “involuntary pornography and sexual or suggestive content involving minors”, an edict that meant the end for both r/deepfakes and r/deepfakeapp.

The ban has, however, had limited effect: at the time of writing, a search for ‘deepfake’ on Pornhub [a very NSFW site – Ed] produces a flood of video nasties.

Pornhub told Motherboard it has banned such content because it does not “tolerate any nonconsensual content on the site” and said: “Nonconsensual content directly violates our TOS (terms of service) and consists of content such as revenge porn, deepfakes or anything published without a person’s consent or permission.”

A spokesperson explained to The Register that it was enforcing the ban in a way that is “similar to how [it] handles other forms of nonconsensual content on the site.”

“We remove content that is flagged on Pornhub as soon as we are made aware of it and deem that it violates our terms of service. In regards to flagging nonconsensual content, either the person who deems the content nonconsensual or their legal representative can use the form to request removal of content and cite that they themselves did not consent for it to be uploaded.”

That response is, however, part of the problem: relying on someone to flag such videos does nothing to deter people from making more of them and uploading them to the internet.

And there are sites ready and willing to host deepfakes. One such, EroMe, describes itself as “the best place to share your erotic pics and porn videos”. A company representative told El Reg it didn’t see anything wrong with deepfakes and thinks of them as “parody”.

“It’s something new and for the moment we do not see any reason to be against [it]. If the people concerned are of age and the video presented as a fake we take this as a parody. If it hurts someone it’s always possible to send a DMCA (Digital Millennium Copyright Act) request and we remove the content.”

Our research also turned up deepfakes.cc, a site offering forums for people to publish their finished videos, post requests to make customised fake pornography, and even ask for tech support. It’s racked up over 2,000 registered members, and has even been cheekily caught mining Monero in the background.

There are doubtless many more sites out there hosting this stuff, either for titillation or to make a buck.

Whatever their motivation, such sites’ willingness to host deepfakes shows that AI is not to blame: whatever the technology allows, it takes people to put it to work. ®