May 31, 2003

(Sharing) Control

by Andrew Stern · 2:22 pm

In the comment thread of Narrative as Virtual Reality, Lisa asks, “Why would an author *want* to yield the authorial control of a piece to some sort of AI engine?”

That’s a really good and interesting question. To me, the issue of who has control (or, similarly, agency) over an interactive artwork is primary. There seem to be at least three parties who could be in (or share) control of an interactive artwork: the original author, the user, and the work itself (e.g., the AI).

The original author can of course exert control over the experience by establishing what content is there in the first place, and the means of accessing that content (e.g., the interface, the user actions offered). The user has control over the experience within the limits circumscribed by the actions and interface the author designed. That is, when the user plays a game or reads a hypertext, the user is obviously exerting control over what content occurs in this session of the experience, effectively participating in defining the experience. Additionally, if the original author decides to do so, the system itself can be programmed with enough autonomy / generativity to make some, or many, decisions about when content occurs, how content is accessed, or even what the content is in the first place. In other words, the system (AI) can become a co-author. Or, in the case of extremely generative artwork, the AI itself could be considered the author, and the human who programmed the AI becomes some sort of meta-author.

So to your question, “Why would an author *want* to yield the authorial control of a piece to some sort of AI engine?” There’s a certain joy in being a co-author with an AI of your own creation. It’s a bit like collaborating with someone else, except that in this case you’ve created your collaborator. (Let me say, although the AIs I’ve worked on are meant to behave in lifelike ways, I by no means view those AIs as more than complex machines. However, I aspire to create an AI that is more intelligent, that does more to surprise me with what it generates.)

So, it’s a partnership, really. Perhaps you’re worried about a scenario where the human author somehow relinquishes *all* control to the AI, losing their own authorship. Well, even if the AI is very generative, the human is still a meta-author of the experience; *someone* had to program the AI in the first place. More typically, though, especially in the case of digital fiction in the near and not-so-near future, a human author will be supplying much of the content themselves (ideally in pieces of very small grain size), and allowing the AI to sequence, combine and recombine that content, and perhaps generate some new content too.
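
To make that concrete, here is a minimal sketch of what such a partnership might look like (a toy illustration of my own, not the architecture of any actual system; the beat texts, tags, and selection rule are all hypothetical). The author writes small beats of content, each tagged with preconditions and effects; the system, not the author, decides at runtime which beat to perform next, so the sequence emerges from the rules rather than from a fixed script:

    import random

    # Toy sketch: the human author supplies small "beats" of content, each
    # tagged with preconditions ("needs") and effects ("adds"); the system,
    # not the author, decides at runtime which beat comes next.
    beats = [
        {"text": "The host pours the drinks.",   "needs": set(),       "adds": {"drinks"}},
        {"text": "A toast is proposed.",         "needs": {"drinks"},  "adds": {"toast"}},
        {"text": "An awkward silence falls.",    "needs": {"toast"},   "adds": {"tension"}},
        {"text": "Someone changes the subject.", "needs": {"tension"}, "adds": {"relief"}},
    ]

    def next_beat(story_state):
        """Pick any not-yet-performed beat whose preconditions are satisfied."""
        candidates = [b for b in beats
                      if b["needs"] <= story_state and not (b["adds"] <= story_state)]
        return random.choice(candidates) if candidates else None

    state = set()
    beat = next_beat(state)
    while beat is not None:
        print(beat["text"])
        state |= beat["adds"]
        beat = next_beat(state)

With only a handful of beats the output is nearly linear, but as the author adds more beats with overlapping preconditions, the number of possible sequences grows, and the system starts making choices the author never explicitly scripted.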

3 Responses to “(Sharing) Control”


  1. nick Says:

    This sort of comment sounds to me like a painter or photographer complaining about an architect designing a building. The former sort of person is used to presenting a single two-dimensional perspective; the architect creates a structure and a space or spaces that someone can inhabit, spaces people can wander through and see from many different perspectives. Why would someone want to give control of the perspective to a three-dimensional structure and to another person who can do this sort of wandering? Well, because buildings can be nice to inhabit and to walk through. That doesn’t mean that paintings and photographs aren’t also nice. They’re different.

  2. andrew Says:

    Yeah, good point, that analogy has truth to it… But the “difference” in this case is of a different nature than, say, 2D vs. 3D. I can see why it would seem especially unsettling for an artist to feel like he or she may lose authorial control to “some AI”. Artists, particularly those who make pieces that get “used” by the audience – e.g., architects, new media artists – have always had to “share control” of the art experience with the audience, and to some extent, with the artifact (e.g., the artifact may break down). Now, with AI-based artifacts, the artifact itself is very autonomous and able to assert control more than ever before… yet another cook in the kitchen…

  3. chrisf Says:

    This is an old post, but it’s been referenced by this one, and addresses a point that is more relevant to what I’m interested in.

    The idea of ‘sharing control with an AI’ loses its weight if an author has control over the functioning of the AI. An understanding of AI methods, and of how to manipulate their inner workings, would be a valuable tool for authors of interactive narratives.

    Interactive narrative tools should thus include a simplified interface into an AI drama manager, so that the rules that dictate the AI’s behavior are open for manipulation. The AI is a tool, just like any other programming tool.

    Most AI algorithms are pretty simple at their core: a set of simple rules that interact to produce a complex system. The basic rules of simulation systems (like those in SimCity) work in much the same way.
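
    As a purely hypothetical illustration of that point (a toy rule of my own, not SimCity’s actual algorithm): a single local rule, applied everywhere on a grid, is enough to make growth patterns emerge that nobody scripted directly.

        # Toy sketch: one simple local rule, applied uniformly, produces
        # emergent growth. (A hypothetical rule, not SimCity's actual algorithm.)
        GRID = 8
        city = [[0] * GRID for _ in range(GRID)]
        city[4][3] = city[4][4] = 1  # seed two developed blocks

        def step(grid):
            new = [row[:] for row in grid]
            for y in range(GRID):
                for x in range(GRID):
                    neighbors = sum(grid[(y + dy) % GRID][(x + dx) % GRID]
                                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                                    if (dy, dx) != (0, 0))
                    if grid[y][x] == 0 and neighbors >= 2:
                        new[y][x] = 1  # a block develops once two neighbors have
            return new

        for _ in range(5):
            city = step(city)
        print("\n".join("".join("#" if c else "." for c in row) for row in city))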

    If an author creates an AI avatar for their own authorial wishes, the control is not so much shared with the AI as it is shared with the player of the interactive system.
