The 'deepfake-style face swap app' ZAO has climbed to the top of Android and iPhone download charts in recent weeks. As its popularity grew, so did privacy concerns on Chinese social media, and now beyond.
Here's how it works:
In case you haven't heard, #ZAO is a Chinese app which completely blew up since Friday. Best application of 'Deepfake'-style AI facial replacement I've ever seen.
Here's an example of me as DiCaprio (generated in under 8 secs from that one photo in the thumbnail) 🤯 pic.twitter.com/1RpnJJ3wgT
— Allan Xia (@AllanXia) September 1, 2019
The sudden wide adoption of ZAO is an “intriguing development in a country where mass surveillance and facial recognition technology are prevalent,” writes Jake Newby at radiichina.com.
“Some social media platforms, including WeChat, have now started blocking ZAO videos,” Newby writes in an update to his story on Monday. “WeChat has done this before with popular rival short video apps.”
The app — developed by Momo, the same company behind popular Chinese dating app Tantan — became an overnight sensation after it began circulating on Friday evening. Hashtags related to the app quickly became some of the hottest on microblogging site Weibo, while the app rocketed up the iOS download charts. For many users, Chinese social media feeds quickly filled with ZAO-produced videos from friends and contacts.
The premise of the app is pretty simple: take a selfie and put yourself into your favorite movie or soap opera (chosen from a pre-selected list of clips). Cue users giving themselves starring roles in Leonardo DiCaprio’s filmography or uninvited guest appearances on Game of Thrones.
As hilarious as it is to see Nick Offerman deepfaked into every part in Full House, it's Ctrl Shift Face's videos of Bill Hader subtly taking on the appearance of those he impersonates that really get under my skin. As soon as you spot it, you realize it actually started some time ago, sneaking in under the cognitive veil like a knife through the ribs.
Here's your daily dose of deepfaking and Keanu.
An entire genre of Saturday Night Live-style skit humor--what if celebrity x were absurdly cast in role y?--is made obsolete by deepfakery.
"When a Starbucks cup is the smallest mistake, you know you've fucked up." Read the rest
Another incredible deepfake execution. Ctrl Shift Face reports that his videos are demonetized due to Content ID claims by the original filmmakers: "Since I cannot monetize any my videos on youtube because of copyright claims, please consider supporting me on patreon. You'll gain access to a ton of exclusive content and also help this channel to survive. Thank you."
A particularly uncanny thing about how this deepfake works is that it isn't the 1992 Stallone you might expect to see, but an ageless metastallone comprising elements of every age of Stallone.
As Theresa May continues to pilot the United Kingdom toward a catastrophic, epochal collision with the Brexit iceberg -- even as her ministers are busy slashing every available lifeboat -- Politics Joe have released a flat-out brilliant video casting the PM and her Minister for Ghastly Cosplay Jacob Rees-Mogg (that is, "Snoop Mogg") as the stars of a very Brexit version of Straight Outta Compton.
"Who's to say that dreams and nightmares aren't as real as the here and now?” ― John Lennon Read the rest
The Dalí Museum in St. Petersburg, Florida has reanimated Salvador Dalí as a deepfake video experience. The "Dalí Lives" video installation opens in April on screens throughout the galleries.
As Dalí once said, “[I] believe in general death but in the death of Dalí absolutely not. [I] believe in my death becoming almost impossible.”
From a press release:
The Museum began this immersive project by collecting and sharing hundreds of interviews, quotes, and existing archival footage from the prolific artist. GS&P used these extensive materials to train an AI algorithm to “learn” aspects of Dalí’s face, then looked for an actor with the same general physical characteristics of Dalí’s body. The AI then generates a version of Dalí’s likeness to match the actor’s face and expressions. To educate visitors while engaging with “Dalí Lives,” the Museum used authentic writings from Dalí himself – coupled with dynamic present-day messages – reenacted by the actor.
On Motherboard, Brian Merchant's (previously) new science fiction story The Convoy poses an eerily plausible future for political deepfake hoaxing -- with James O'Keefe-alikes running the show -- that skillfully weaves in elements of the Innocence of Muslims hoax with the current state-of-the-art in high-tech fakery.
SIGGRAPH is coming, when all the amazeballs graphics research drops, and the previews are terrifying and astonishing by turns (sometimes both!).
Begun, the deepfake wars have.
As usage grows of FakeApp -- the software that makes it comparatively easy to create "deepfaked" face-swapped videos -- a couple of researchers have decided to fight fire with fire. So they trained a deep-learning neural net on tons of examples of deepfaked videos, and produced a model that's better than any previous automated technique at spotting hoaxery. (Their paper documenting the work is here.)
This is good, obviously, though as you might imagine the very techniques they're using here could themselves be employed to produce better deepfakes. Technology!
The results are impressive. XceptionNet clearly outperforms other techniques in spotting videos that have been manipulated, even when the videos have been compressed, which makes the task significantly harder. “We set a strong baseline of results for detecting a facial manipulation with modern deep-learning architectures,” say Rossler and co.
That should make it easier to spot forged videos as they are uploaded to the web. But the team is well aware of the cat-and-mouse nature of forgery detection: as soon as a new detection technique emerges, the race begins to find a way to fool it.
Rossler and co have a natural head start since they developed XceptionNet. So they use it to spot the telltale signs that a video has been manipulated and then use this information to refine the forgery, making it even harder to detect.
It turns out that this process improves the visual quality of the forgery but does not have much effect on XceptionNet’s ability to detect it.
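The core idea behind a learned forgery detector like the one described above is just supervised classification: label known-real and known-fake frames, extract features, and fit a model that separates the two. As a toy sketch only — this is a plain logistic regression on invented "frame features," not XceptionNet, and the statistical artifact the fakes carry here (a shifted mean) is an assumption made up for illustration:

```python
import math
import random

random.seed(0)

def make_frames(n, fake):
    # Toy stand-in for extracted frame features. We assume (purely for
    # illustration) that manipulated frames carry a slight statistical
    # artifact -- here, a shifted mean -- that a detector can learn.
    shift = 1.0 if fake else 0.0
    return [[random.gauss(shift, 1.0) for _ in range(4)] for _ in range(n)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(frames, labels, epochs=200, lr=0.1):
    # Logistic regression via stochastic gradient descent: the skeleton
    # of any learned detector, minus the convolutional feature extractor.
    w = [0.0] * len(frames[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(frames, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def accuracy(w, b, frames, labels):
    # Fraction of frames the trained detector classifies correctly.
    correct = 0
    for x, y in zip(frames, labels):
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        correct += int((p > 0.5) == bool(y))
    return correct / len(frames)

# Label real frames 0 and fake frames 1, then fit the detector.
X = make_frames(200, fake=False) + make_frames(200, fake=True)
y = [0] * 200 + [1] * 200
w, b = train(X, y)
```

The cat-and-mouse dynamic the researchers describe falls out of this framing: a forger who knows the detector can tune their output to shrink whatever artifact the model keys on, and the detector must then be retrained on the refined fakes.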