February 23, 2007
The work of AI-oriented Northwestern University grad students has been interesting for quite some time (1 2 3) — and here’s a new project to add to the list. Nate Nichols and Sara Owsley, in Kristian Hammond’s InfoLab, have created a system called News at Seven that intelligently and autonomously pulls together news text, images, and video from the Web, related commentary from the blogosphere, avatars from Half-Life 2, and speech synthesis to create a short daily newscast, broadcast via YouTube. It’s an AI-based mashup.
Exploring the archives, I think they only recently tuned the system to create a tighter shot on the news anchor, which is a good thing. I kind of like the remote camera for their “blogosphere reporter”, a guy standing against throngs of bland voices in a snowy dystopian wasteland. (A somewhat harsh characterization of the blogosphere, perhaps, and more a product of the limited number of avatars available in the Half-Life 2 engine, but amusing nonetheless.)
Lots of interesting papers by this group can be found on their site.