January 29, 2008

EP Meta: Chapter One

by Noah Wardrip-Fruin · 7:17 am

With today’s post of section 1.6, we’ve reached the first major milestone of the Expressive Processing review. The entire first chapter has now been posted. Given this, I’d like to ask for further thoughts about issues that have been raised — and also invite wider discussion.

Here are some of the comments that stand out most for me, thus far:

Both Ian Bogost and Barry Atkins found the first chapter’s self-referentiality (something much less present in later chapters) problematic.

Both Lev Manovich and Matt Kirschenbaum raised the issue that software studies (and, by extension, Expressive Processing) should be careful not to uncritically reproduce the structures of computer science.

With Terry Bosky, Lord Yo, and Nick Montfort there was a discussion of whether it made sense to lump Pong and Tetris together as “early” games. While this was sparked by an infelicitous word choice (I realized I probably meant something more like “iconic”), some interesting points were made.

Finally, with Barry Atkins, Chris Lewis, and Nick Montfort we got into an in-depth discussion of The Sims, Myst, asset libraries, graphical processing, behavioral processing, and process intensity.

I’d be very interested in further discussion of these points, either here or back in the earlier posts. All of them will be on my mind as I do my next round of revisions. More generally, I’d also like to hear any thoughts about the first chapter as a whole. While, obviously, it remains for the rest of the manuscript to make good on the first chapter’s promise, I’m hoping that the promise is coherent and enticing. I also hope its argument, that it is worth paying attention to digital media’s processes for these reasons and in these ways, seems convincing.

Finally, I’ll take this “meta” opportunity to link to Ben Vershbow’s thought-provoking project announcement on if:book that wasn’t yet live when I did my initial project post. Since then I’ve also become quite intrigued to see what Kathleen Fitzpatrick is cooking up in a forthcoming paper on the history and future of peer review and upcoming MediaCommons projects. And I’m happy to see this project sparking discussion elsewhere, as summarized in blog posts from MIT Press (citing Info/Law, ReadWriteWeb, Scholarly Communication, Sources and Methods, and Voir Dire) and the Chronicle‘s Footnoted (citing Info-Fetishist, The Valve, and Progressive Historians). I’ve also noticed discussions at Computerworld, The Scientist, Weblogg-ed, Blog Herald, and Educational Games Research. I won’t respond to all the ideas here, but they will provide fodder for whatever I eventually write about this project. Any other interesting responses I should know about?

4 Responses to “EP Meta: Chapter One”


  1. RodeWorks » Blog-based peer review with CommentPress Says:

    […] Grand Text Auto » EP Meta: Chapter One […]

  2. Catching up and going places « Literature’s Next Frontier Says:

    […] Expressive Processing. A meta-discussion is to follow in the next phase, check it out. Here is the meta of Grand Text […]

  3. JR Says:

    Noah,
    I’d really like to hear a little bit more about the relationship of data to process, especially with your example of The Sims. This stems partly from Barry’s excellent point about the high number of art assets in The Sims (and high volume vs. high polys), which you begin to respond to further down in that series of comments. Given the popularity of The Sims expansion packs, one wonders what the ratio of new data to new processes is in those packs, and what that does (or does not) reveal for your argument. I’m thinking also of your figures 1.3 and 1.4, which both place data and process in a single box, separated by a thin, dashed vertical line that visually suggests distinct categories with some degree of osmotic transfer (a line you admit is rather fuzzy in your footnotes in section 1.2 http://grandtextauto.org/2008/01/23/ep-12-authoring-processes/#22). That data and process intermingle, as you note, “does little to diminish the basic usefulness of the concepts,” and yet it does complicate arguments that one is more responsible than the other for a game’s success. Recalling your DH2007 paper on the three “effects,” I suspect that your future chapters (which I admit to not having read yet) might discuss this relationship between data and process a bit more, but I think it might be worth forecasting and/or clarifying it a bit more here.

    Also, you noted in one of your comments that

    The simple point is that the original version of The Sims intentionally used simple graphics, and comparatively low-end graphical processing, rather than latest high-end lighting and so on — but this did not stand in the way of its success.

    but I’d argue that the low-end system requirements *contributed* to its success rather than simply “not standing in the way,” since more accessible requirements allow for a broader base of users. A quick survey of other PC titles and their system requirements might at least give a basis for comparison. Such a survey might also give you a sense of the data requirements for other games published in 2000 (even if only in the broadest sense, such as hard drive space requirements).

    For the easiest fix to the overall issue, I think you could simply state that The Sims’ success proved the viability of a combination of innovative, intense behavioral processing with a deep library of iconic art assets. It avoids the “voted in favor of” language that I think causes most of your problem in that paragraph.

    Jason

  4. noah Says:

    Jason, thanks for your thoughtful comment. We sometimes talk of the primary purpose of peer review as identifying the problem areas in a work (and, perhaps, considering whether there are too many problems for publication to be justified). I think this review has served that problem-locating function most clearly for the “process intensity” section of the first chapter.

    Personally, I think there may be a whole essay about process intensity in my future. This would provide the space to trace through the nuances of the different approaches to the idea. It would also provide an opportunity to mull over the difficult border cases for process intensity. For example, with thumbnail definitions of “process” and “data,” one might come to the counter-intuitive conclusion that logical declarations are data even if they will determine the functioning of the system. I’d want to consider this carefully both for relatively specialized topics (such as declarative programming) and for more mainstream ones (the logical information, designed to function in the simulation, bundled with graphics and animation information for the more substantial elements of expansion packs for The Sims).
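    A minimal sketch of that border case (hypothetical rules and values of my own invention, not code from the book or from The Sims itself): the rules below are declared as plain data, yet, once interpreted, they determine how the system behaves.

    ```python
    # Declarative rules stored as data: (need, threshold, action).
    # Hypothetical values for illustration only.
    RULES = [
        ("hunger", 3, "eat"),    # act when hunger satisfaction drops below 3
        ("energy", 2, "sleep"),
        ("social", 4, "chat"),
    ]

    def choose_action(needs):
        """Interpret the declarative rules against a character's current needs."""
        for need, threshold, action in RULES:
            if needs.get(need, 10) < threshold:
                return action
        return "idle"

    # The same engine with a different RULES list behaves differently: are
    # those rules "data" or "process"? That is the counter-intuitive case.
    print(choose_action({"hunger": 1, "energy": 5, "social": 5}))  # -> eat
    ```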

    That said, I think you’re right about pursuing a quick fix for purposes of this book. First, I can drop the non-essential point about the simple graphics of The Sims. Especially after the success of the Wii, there are probably no readers who think that maximizing polygons, textures, and so on is required for a game to do well. Second, I can simply talk about The Sims as a modern example of a game that works well while having a complex, innovative set of processes at the core of its gameplay. That’s really the point I need to make in this section.
