Automated video news program from RSS and game graphics

News at Seven is a mind-blowing automated news-video project from Northwestern University. It pulls news stories in from RSS feeds, digs up video and still images, and then composes a story that's "read" by a video-game character from Half-Life. It even cuts away to a different reader narrating posts from blogs related to the day's story. The text-to-speech engine could be a little clearer, but man, I don't think I've shouted "Whoa" more times during a three-minute clip in months.

Totally autonomous, it collects, parses, edits, and organizes news stories, then passes the formatted content to an artificial anchor for presentation. Drawing on resources already available on the web, the system goes beyond the straight text of the news stories, also retrieving relevant images and blog commentary on the topics to be presented.
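
To get a feel for the collection step, here's a minimal sketch of pulling story items from an RSS feed with Python's feedparser library. This isn't the project's actual code; the feed URL is a placeholder and the field names are just what feedparser exposes.

```python
# Minimal sketch of the RSS-collection step, not News at Seven's actual code.
# Requires the third-party `feedparser` library (pip install feedparser);
# the feed URL below is a placeholder.
import feedparser

FEED_URL = "https://example.com/news/rss"  # placeholder feed

def collect_stories(url: str, limit: int = 10):
    """Fetch a feed and return title/summary/link dicts for its entries."""
    feed = feedparser.parse(url)
    stories = []
    for entry in feed.entries[:limit]:
        stories.append({
            "title": entry.get("title", ""),
            "summary": entry.get("summary", ""),
            "link": entry.get("link", ""),
        })
    return stories

if __name__ == "__main__":
    for story in collect_stories(FEED_URL):
        print(story["title"], "-", story["link"])
```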

Once it has assembled and edited its material, News at Seven presents it to the audience using a graphical game engine and text-to-speech (TTS) technology, in a manner similar to the nightly news watched regularly by millions of Americans. The result is a cohesive, compelling performance that successfully combines techniques of modern news programming with features made possible only by the fact that the system is, at its core, completely virtual.
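
And for a rough sense of the TTS half of that presentation step, here's a minimal sketch that reads a generated script aloud using the pyttsx3 library. This is purely illustrative; News at Seven's own TTS and game-engine pipeline is its own thing.

```python
# Minimal sketch of reading a generated news script aloud with text-to-speech.
# Uses the third-party `pyttsx3` library (pip install pyttsx3) for illustration;
# this is not the pipeline News at Seven actually uses.
import pyttsx3

script = (
    "Good evening. Our top story tonight comes from the day's RSS feeds, "
    "with commentary pulled from related blogs."
)

engine = pyttsx3.init()          # pick the platform's default speech driver
engine.setProperty("rate", 160)  # words per minute, a readable anchor pace
engine.say(script)               # queue the script for narration
engine.runAndWait()              # block until the narration finishes
```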

Link

(Thanks, Matt!)