Late last year, a redditor called Deepfakes gained notoriety for the extremely convincing face-swap porn videos he was making, in which the faces of mainstream Hollywood actors and rock stars were overlaid on the bodies of performers in pornography.
Now, /r/deepfakes is filling up with convincing pornographic faceswaps of celebrities, and when they escape the confines of the subreddit, they get posted to tabloid sites as "genuine" sex tapes.
The tool, meanwhile, is undergoing rapid development, making strides in usability and polish of its output, heralding a day, very soon, when we will see a lot of these fakes and struggle immensely to distinguish them from reality. They needn't be pornographic, of course — you could faceswap Gandhi onto the aggressor in a beat-down, or Mike Pence onto a leather-clad dancer on a Pride parade float.
According to deepfakeapp, anyone who can download and run FakeApp can create one of these videos with only one or two high-quality videos of the faces they want to fake. The subreddit's wiki states that FakeApp is "a community-developed desktop app to run the deepfakes algorithm without installing Python, Tensorflow, etc.," and that all one needs to run it is a "good GPU [graphics processing unit, the kind that high-end 3D video games require] with CUDA support [NVIDIA's parallel computing platform and programming model]." If users don't have the proper GPU, they can also rent cloud GPUs through services like Google Cloud Platform. Running the entire process, from data extraction to frame-by-frame conversion of one face onto another, would take about eight to 12 hours if done correctly. Other people have reported spending much longer, sometimes with disastrous results.
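For the curious: the "deepfakes algorithm" the wiki refers to is, at its core, an autoencoder with one shared encoder and a separate decoder per identity — encode a frame of person A, decode it with person B's decoder, and you get B's face with A's pose and expression. Here is a minimal numpy sketch of that architecture; the dimensions, weights, and function names are toy placeholders (the real models are convolutional networks trained on thousands of aligned face crops), not FakeApp's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes: real models work on ~64x64 pixel face crops, not 16-vectors.
FACE_DIM, LATENT_DIM = 16, 4

# One shared encoder learns a common "face space"; each identity gets
# its own decoder trained to reconstruct that person from the latent code.
W_enc = rng.standard_normal((LATENT_DIM, FACE_DIM))
W_dec_a = rng.standard_normal((FACE_DIM, LATENT_DIM))
W_dec_b = rng.standard_normal((FACE_DIM, LATENT_DIM))

def encode(face):
    # Shared encoder: compress any face into the common latent space.
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    # Identity-specific decoder: reconstruct a face from the latent code.
    return W_dec @ latent

def swap_to_b(face_a):
    # The swap trick: encode A's frame with the shared encoder, but
    # reconstruct with B's decoder -> B's identity, A's pose/expression.
    return decode(encode(face_a), W_dec_b)

frame_of_a = rng.standard_normal(FACE_DIM)
swapped = swap_to_b(frame_of_a)
print(swapped.shape)
```

The "eight to 12 hours" figure above is dominated by the training loop this sketch omits: both decoders must first be trained, frame by frame, against the two source videos before any convincing swap comes out.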
We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now [Samantha Cole/Motherboard]