January 14, 2006

OpenEnded Lingering

by Andrew Stern · 2:06 am

[Image: Marc Downie's dance piece]

My friend eugene sent me a link to an artificial.dk interview with Marc Downie, whom you interactive character enthusiasts may know as the person responsible, among other things, for the excellent real-time charcoal rendering of the MIT Media Lab’s virtual wolves. A little research reveals that Marc recently defended his PhD dissertation, “Choreographing the Extended Agent: Performance Graphics for Dance Theater”, and is now part of a collaborative group of artists called OpenEnded Group. The interview describes How long does the subject linger on the edge of the volume…, a dance piece with live interactive imagery, pictured here. Cool stuff.

(AND the research reveals that Bruce Blumberg, head of the once high-profile, now-defunct Synthetic Characters group, is now Director of Advanced Animal Modeling at Blue Fang Games! “…Creating the technology, tools and processes that will enable Blue Fang to create expressive, intelligent and engrossing animal characters that will set the bar for the next generation of digital entertainment experiences.” Interesting how so many interactive character researchers over the years have left academia for industry.)

Exploring artificial.dk a bit more, I found a new series of articles on Art Games worth checking out.

One Response to “OpenEnded Lingering”


  1. michael Says:

    OpenEnded Group came to Atlanta a couple of months ago and gave a talk at Georgia Tech. It was great to see Marc Downie again and meet the other members of the group. While I enjoyed seeing documentation of how long does the subject linger on the edge of the volume… and hearing them talk about it, the piece I was really struck by was Enlightenment, an in-progress piece that will open this summer. In Enlightenment, the system “meditates” on the coda of Mozart’s last symphony, looking for patterns and generating alternatives. The long-term processes of the AI system are made visible on several large displays at the entrance to Lincoln Center. The generated imagery on the displays captures, in an abstract visual language, the current search-state of the system, the patterns it is exploring, and so forth. The displays open a window into the AI system’s “mind.”

    Back when we were working on Terminal Time, we toyed with the idea of generating an animated visual display depicting Terminal Time’s history generation process; the idea was to provide an aesthetically engaging display that offered insight into the processes of ideologically-biased reasoning that produce a generated history. While we weren’t able to implement this for Terminal Time, I’ve long thought that opening windows into artificial minds would be a great direction to explore.

    Back in high school I had the Chess cartridge (among many others) for my 2600; as many of you may recall, Chess would fill the screen with solid colors while thinking, flickering through colors as the search process proceeded. Back then I enjoyed staring at the colors and trying to figure out from the color patterns what might be going on in the machine. While I know that for 2600 Chess the flickering color patterns were most likely nothing more than a screen saver (literally, as in preventing raster burn-in), that was what first got me thinking about opening windows into the mind of the machine. I’m glad to see that OpenEnded Group is doing a project in this space.
