February 18, 2004
On Academia – Industry Conversations
Chaim Gingold wrote this month’s IGDA Ivory Tower column.
Game developers and academics, by engaging one another, can help both of their practices mature. But what does it mean to have conversations with one another? If we’re going to play together, what are the rules of the game, and what are the motivations of its players?
February 18th, 2004 at 2:53 am
Bleh. Sycophantic blatherings.
February 22nd, 2004 at 3:49 am
I think some of the most fundamental points about the lack of dialog between academia and the games industry have been missed in this article. It mentions the contrasting fortunes of academia and graphics professionals, so I assume we are talking here particularly about game play and behavior in games. I am not an expert on game play, but I can offer my thoughts on how academia and the industry can benefit and assist each other in the area of more engaging behavior and interaction.
What they need
Game Developers: Theories (how-tos) on adding more engaging behavior and interaction to real game environments (no toy problems or closed worlds).
Researchers: Real game environments in which to test their theories of behavior and interaction.
Sounds like a nice fit, but we don’t have it.
While there are freely available game engines, these are not of much use to most researchers. They are currently focused solely on the graphical aspects of the game world and its mechanics, leaving behavior and interaction to be programmed by hand. That is beyond the resources of most departments, I would assume (how many have a full-time game programmer on hand?).
What is needed is a complete game environment into which researchers can easily "plug in" modules that test their theories. This would require the kind of toolkit that is not yet available. What it would offer, however, is a "common currency" between industry and academia, a common platform on which to share ideas.
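To make the idea concrete, here is a minimal sketch of what such a plug-in point might look like. The interface names (BehaviorModule, WorldView, Action) are hypothetical, not taken from any existing engine: the assumption is simply that the host environment owns the world, the rendering, and the simulation loop, while a researcher's theory lives entirely inside one small class that gets called every tick.

```cpp
// Hypothetical plug-in contract between a host game environment and a
// research behavior module. The environment drives the loop; the module
// only sees a world snapshot and returns an action.

#include <string>
#include <vector>

// Read-only snapshot of whatever the environment chooses to expose.
struct WorldView {
    std::vector<float> agentPosition;         // e.g. x, y, z of the controlled agent
    std::vector<std::string> visibleObjects;  // identifiers of nearby entities
    float timeSeconds;                        // current simulation time
};

// The command a module hands back to the environment each tick.
struct Action {
    std::string verb;    // e.g. "moveTo", "say", "idle"
    std::string target;  // object or location the verb applies to
};

// The contract a research module implements; the engine never needs to know
// what theory of behavior sits behind it.
class BehaviorModule {
public:
    virtual ~BehaviorModule() = default;
    virtual void onAttach(const std::string& agentId) = 0;  // called once when plugged in
    virtual Action think(const WorldView& view) = 0;        // called every simulation tick
};

// A trivial example module: head toward the first visible object, else idle.
class CuriousWanderer : public BehaviorModule {
public:
    void onAttach(const std::string& agentId) override { agentId_ = agentId; }
    Action think(const WorldView& view) override {
        if (!view.visibleObjects.empty())
            return {"moveTo", view.visibleObjects.front()};
        return {"idle", ""};
    }
private:
    std::string agentId_;
};
```

The point is that the environment and the module trade only WorldView and Action: the researcher never touches rendering or engine plumbing, and a developer could in principle drop the same module into a full game without rewriting the theory behind it.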
The problem for researchers is that a useful platform on which to test their theories is vastly more complex to construct than the theories themselves. They could spend all of their time on this aspect alone while their research gathers dust (and I speak from painful experience). Building in a "real" world would allow them to construct a full application rather than merely a theory.
For industry developers the problem is one of translating a theory that has been tested in a closed world into their massively complex environments, along with the accompanying emergent behavior [read: unknown behavior]. How a theory relates to actual behavior is difficult to imagine unless you can actually see it and try it. And because a theory has usually been only lightly implemented (if at all), the route from theory to a full product implementation can be tortuous. As I am sure most of you will agree, the devil is in the details, and those details are not apparent until you actually try to apply a theory in a real environment. Developers therefore need a full set of "blueprints", and not just a theory, to help them with implementation. Graphics developers know how to take a simple algorithm and add it to their games; for behavior, that route is not yet well understood.
Interestingly, the article refers to the theory of movies and how researchers could make their own movies in order to dissect them, a technique that is not available to today's game researchers. This is an analogy I use often with the game industry: in effect, the current practice in studios (though it is changing) is to throw away your "camera" after every movie and spend the first year of your next movie building a new whiz-bang camera. By "camera" I am referring to game engines and tools, and this is an unsustainable position. The game industry, as well as academia, needs to move away from building cameras and toward building games, i.e. having tools that let them concentrate on the core aspects of game play, behavior, and interaction, not GPU addressing.
In terms of conferences, I am not really surprised that a conference for game developers and academics held in the Netherlands in November was not well attended. There is a world of difference between the pressures on the two parties. Game companies often have no budget or policy for sending staff to conferences, which means you go on your own time and your own dime (and often you have neither). For academics, conferences are part of the job. It would be reasonable, then, for academia to do whatever it can to accommodate developers (who I know would love to attend and increase their knowledge). That means holding conferences in the few centers of game development, at times that are amenable to developers (i.e. not between September and January). The most obvious move to me would be to build on top of GDC, either immediately before or after. GDC is well known to industry, may allow for budgeted involvement, and would allow for the easiest cross-pollination between research and development. The same could be done with the premier European and Asian game developer conferences.
Thought of the day: we need a common currency in which to trade ideas. That is, a common time, a common place, and, most importantly, a common platform.