January 9, 2006
I have only brief notes from one session of the Modern Language Association Convention this year – the December 29 one on new media editing, chaired by Neil Fraistat of the University of Maryland, in which I presented “Toward Scholarly, Critical, and Variorum Editions of Computer Programs.” This was as tedious a paper title as one can imagine (sure to drive both computer enthusiasts and those in textual studies into slumber), but the other two speakers more than made up for this, presenting new interfaces to motion pictures (Stephen Mamber, UCLA) and a compelling take on how to approach video games via bibliography (Steven Jones, Loyola U. Chicago).
In my talk, I argued that “computer program” is a useful concept, and focus, for humanists studying new media. Computer programs are the most general, powerful formulation of what computers can do; they are expressive, and have been used for artistic and literary purposes for quite a while; and “computer programs,” as a concept, focuses on an important part of a system, just as the concept of a text provides focus. My hope was to build on Matt Kirschenbaum’s work in his 2002 article “Editing the Interface: Textual Studies and First Generation Electronic Objects” (still not online! arg!) by looking at computationally intensive works – not just Myst and Dragon’s Lair, addictive as they might be, but The Sims and Fable. I discussed re-issues (the 20th anniversary Infocom Hitchhiker’s Guide to the Galaxy), ports and reimplementations (to more powerful and less powerful platforms), mods and engines (from Quake to Half-Life to Counter-Strike and mods of it), and bug fixes. I also presented some examples of fan bibliography, such as the Infocom Bugs List, and commentary on source code – the Lions book being the main example there. I concluded with some specific editorial challenges for new media work and some discussion of the Electronic Literature Collection.
Before we began, the panel was photographed by a spy robot that briefly occupied the body and cell phone of Matt Kirschenbaum.
Stephen Mamber, a professor in the UCLA Department of Film and Television, showed several new ways to study and experience film, using the computer’s capabilities to help viewers question and think about cinema. One interface presented tiled pictures that could be resized – each one the first image in a shot from Hitchcock’s The Birds. Clicking on an image would play the shot, but the interface also made it easy to select a portion of the film to view and to look at a segment of it shot-for-shot. Another interface was hooked to Kubrick’s The Killing, augmenting it by offering a narrative map of the characters’ overlapping stories and 3D models of the spaces in which the scenes took place. Finally, we saw the Center for Hidden Camera Research, where new media interfaces are used to interrogate surveillance footage.
Steven Jones, a professor of English at Loyola University Chicago, spoke next about how bibliographic methods could be thoughtfully used to understand video games – not treating them, of course, as books or purely as narratives. “Video” isn’t at the core of video games, he said, but serves to separate this object of study from textual studies. Video games, far from being self-contained, include encoding, platforms, social networks of players, and marketing and reception, as his main example, I Love Bees, demonstrates. This alternate reality game should not only be studied for its own sake but should also figure into any consideration of Halo 2, which it was designed to promote. Taking a step further, he suggested that ARG players act like textual editors in assessing evidence, comparing variants, and seeking something definitive. Bibliography can move beyond looking at variants and ports of games and into the lives of players.
Also of note from this year’s MLA: Some scenes from the Electronic Literature Organization get-together have been captured and placed online.