Researchers at Stanford have developed face-capture technology that can alter pre-recorded videos in real time on low-cost computers. In other words, you can make George W. Bush or Donald Trump appear intelligent.
We present a novel approach for real-time facial reenactment of a monocular target video sequence (e.g., a YouTube video). The source sequence is also a monocular video stream, captured live with a commodity webcam. Our goal is to animate the facial expressions of the target video by a source actor and re-render the manipulated output video in a photo-realistic fashion. To this end, we first address the under-constrained problem of facial identity recovery from monocular video by non-rigid model-based bundling. At run time, we track facial expressions of both source and target video using a dense photometric consistency measure. Reenactment is then achieved by fast and efficient deformation transfer between source and target. The mouth interior that best matches the re-targeted expression is retrieved from the target sequence and warped to produce an accurate fit. Finally, we convincingly re-render the synthesized target face on top of the corresponding video stream such that it seamlessly blends with the real-world illumination. We demonstrate our method in a live setup, where YouTube videos are reenacted in real time.
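The core idea of the transfer step can be illustrated in miniature. The paper uses a sophisticated sub-space deformation transfer over a parametric face model; the toy sketch below (not the authors' code; all dimensions, variable names, and the random bases are hypothetical) shows only the blendshape-style intuition: fit expression coefficients to the source actor, then apply the same coefficients to the target identity's basis.

```python
import numpy as np

# Hypothetical toy dimensions: V mesh vertices, K expression blendshapes.
V, K = 4, 2

# Each face is modeled as a neutral geometry plus a linear combination
# of expression blendshapes: face(delta) = neutral + B @ delta.
rng = np.random.default_rng(0)
source_neutral = rng.normal(size=(3 * V,))
target_neutral = rng.normal(size=(3 * V,))
B_source = rng.normal(size=(3 * V, K))   # source actor's expression basis
B_target = rng.normal(size=(3 * V, K))   # target actor's expression basis

def fit_expression(frame, neutral, B):
    """Least-squares fit of expression coefficients to an observed face."""
    delta, *_ = np.linalg.lstsq(B, frame - neutral, rcond=None)
    return delta

def reenact(source_frame):
    """Transfer the source actor's expression onto the target identity."""
    delta = fit_expression(source_frame, source_neutral, B_source)
    return target_neutral + B_target @ delta

# A synthetic source frame with a known expression (delta = [0.5, -1.0]):
true_delta = np.array([0.5, -1.0])
frame = source_neutral + B_source @ true_delta
out = reenact(frame)
# The reenacted target face carries the same expression coefficients.
assert np.allclose(out, target_neutral + B_target @ true_delta)
```

In the real system the coefficients are recovered not from mesh geometry but by minimizing a dense photometric error against the video frames, and the result is composited back into the target footage.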
Industrial Light & Magic's Experience Lab (ILMxLAB) is a newly formed supergroup of artists, engineers, sound designers, and storytellers prototyping the future of interactive, immersive cinema for Lucasfilm. Over at Bloomberg Businessweek, I wrote about my visit to the xLAB, where The Force is quite strong:
"The way we do technology development here is really hand-in-hand with the creative goals," says (Lucasfilm CTO Rob) Bredow. "The R&D is always in service to the story."
For example, to port the Millennium Falcon from the Star Wars film universe into the interactive realm, the Advanced Development Group engineers first had to figure out how the VR hardware could render the massive 3D model in just milliseconds, compared with hours or days for a film shot. Then Skywalker Sound built a surround system that realistically rumbles and whooshes as a Corellian starship should. Meanwhile, game designers and the storytellers hashed out the most compelling way for a Jedi-in-training (you) to battle an army of Stormtroopers with a lightsaber.
"THE SUPERGROUP REMAKING STAR WARS AND JURASSIC WORLD IN VR" (Bloomberg Businessweek)
Above, "Game of Thrones" before the computer graphics.
Guardians of the Galaxy:
Pirates of the Caribbean:
Life of Pi:
The Twilight Saga: Eclipse:
See more at Design You Trust. (via Neatorama)
Tim Fitzgerald shot video of naked people. Then he shot video of the same people holding the same poses, but wearing clothes. Then other people smeared and splashed green paint on the clothed subjects, and Fitzgerald used green-screen keying to superimpose the two videos into this amusing and NSFW short film.
It's an age-old complaint about video games and films: bad graphics make them suck. But plenty of classic entertainment holds up even if the effects don't. RocketJump Film School examines the issue in a brisk overview.
In 1986, Industrial Light & Magic artist John Bell designed what would become an icon of Hollywood futurism: Marty McFly's Hoverboard from Back to the Future Part II.
From J.J. Abrams's fantastic tribute to Dick Smith, the pioneering SFX make-up artist behind The Exorcist, Scanners, Little Big Man, and so many more:
The Making of Raiders of the Lost Ark, which aired on PBS in 1981.
The death of stop-motion animation pioneer Ray Harryhausen raises questions about the future of special effects, writes Ethan Gilsdorf. In the good old days, it did not take so much to trick the eye.