August 21, 2004
ISEA 2004: Critical Interaction Design Keynote
Wendy Hui Kyong Chun: “Control and Freedom: On Interactivity as a Software Effect”
Asked to address one of ISEA’s sins: our tendency to take work at interface value, to appreciate work for its novelty rather than for the actual experience of the work.
User-friendly interfaces conflate control with freedom. A version of freedom is emerging within politics, society, and computing that isn’t opposed to control. As in gated communities.
Three clarifications: (1) Not a condemnation of software and its interfaces. (2) Not denigrating freedom, but seeking a more rigorous version of it. (3) Not arguing that control is absolute.
She begins with her packet sniffer (Sniffles on Mac OS X). Some of you may think your interface gives you control of your machine. But as you can see, your computer constantly wanders without you, sending and receiving messages (many of them “can you see me?”) that have no presence in the interface without a program like this. (Next she talks about Back Orifice and Magic Lantern as if they were the same thing. Isn’t one for remote control and the other a keystroke monitor?)
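(An aside of my own, not part of the talk: to make the point concrete, here is a minimal packet-sniffer sketch in Python, using the scapy library rather than whatever tool Chun demoed. Run it with root privileges and even an “idle” machine shows a steady trickle of ARP, DNS, and “can you see me?”-style chatter that never appears in the interface.)

    # Minimal packet-sniffer sketch (assumes the scapy library; not Chun's tool).
    # Needs root/administrator privileges to open the network interface.
    from scapy.all import sniff

    def show(packet):
        # summary() prints one line per captured packet, e.g.
        # "Ether / IP / UDP 10.0.0.5:5353 > 224.0.0.251:5353"
        print(packet.summary())

    # Capture 25 packets on the default interface, then stop. Even with no windows
    # open, the machine is busily announcing itself and listening for others.
    sniff(prn=show, count=25)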
The time when it was impossible to advertise the Internet without featuring happy people of color. The Internet is screen deep. Race is skin deep. “People can communicate mind to mind. There is no race. Utopia? No, the Internet.” The escape from the body. Colored bodies, not cultures, are to blame for racism. This replaces the image of the Internet as teeming with kiddie porn. But then, after 9/11, it goes from “Yay, people riding camels in the desert have access to the Internet!” to “Oh Shit!” at the same image. But the message is the same: get it, because they already have it or soon will.
Next, a potted history of interactivity. She points to the SAGE project as an origin point: built to detect incoming Soviet missiles, and obsolete by the time it was complete. She points to John McCarthy, talking about how LISP was designed to be worked with interactively (rather than in batch mode). Quotes from Engelbart and Nelson, pointing to freedom through the ability to control information. Shneiderman on direct manipulation. Laurel’s note that direct manipulation must be complemented by direct engagement, emotional as well as cognitive values. Chun points out that for paranoid schizophrenics, everything has meaning. Sounds a bit like Laurel saying the user should be able to understand everything that happens on the screen as happening for a reason. Agre on surveillance and capture.
Next, jumping the screen. What we consider to be programming today was not always considered programming. The belief in software as something readable and manipulable depends on software. In contrast, consider what is now called “direct programming” (plugging parts of the computer into other parts — image of the “ENIAC girls.”) When “automatic programming” was being developed, it was a move from commanding a girl to commanding an automaton. A move led by people like Grace Murray Hopper (a woman). Automatic languages are based on hiding knowledge, making the computer more secure (from the programmer). “Any programming method or approach that assumes that people will understand a lot is inherently risky.” The causal pleasure of object-oriented programming. Compared to what Manovich says about They Rule in Generation Flash. (But one is data mining made playable. And the other is not about exploring pre-existing data, but making something, right?) The connection is that they ask us to think similarly about leveraging things beneath the surface?
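(Another aside of my own, not Chun’s example: a toy Python sketch of the hiding she describes. The class and names are hypothetical; the point is only that the higher-level, object-oriented layer lets you issue commands against an abstraction without ever seeing, or being trusted with, the representation underneath.)

    # Toy illustration of abstraction as hiding (hypothetical names, my example).
    class Desktop:
        """Callers issue commands; how and where things are stored stays hidden."""
        def __init__(self):
            self._files = {}            # underscore marks this as internal: not the caller's business

        def save(self, name, data):
            self._files[name] = data    # the storage mechanism could change without callers noticing

        def open(self, name):
            return self._files[name]

    d = Desktop()
    d.save("letter.txt", "Dear ...")
    print(d.open("letter.txt"))         # the "user" never touches the dictionary underneath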
Now, ideology and software. Software conforms to almost every definition we have of ideology. Imaginary relations (to our hardware), false consciousness (The Matrix). Its defaults are referred to as “your preferences.” Software produces users. Software is based on a fetishistic logic: the user knows well that the desktop is not a desktop, but goes along with using this language. We know well that we are engaging in something false, but we go along with it through what we do, and this screens the fact that authority is without truth. The new rhetoric of interactivity may be more obfuscatory than liberating. But the parallel between software and ideology is purely formal, and in this it drains actual theories of ideology of their power, their critique of power. Yet we see how this type of parallel runs through society. Nurture is software, nature is hardware.
Important to say that she’s not advocating a closer relation to hardware, nor condemning all interfaces. She points to work here at ISEA, and to Mongrel’s “Heritage Gold.” Free software can also help us develop the more rigorous idea of freedom mentioned before. But it can’t rest with the GPL; it must also think at the level of the interface, and question why public and private have been subsumed into closed and open. To return to Laurel, using a system that enables anything “might be more like an existential nightmare than a dream of freedom.” But what is real freedom but an existential nightmare? We must reject definitions of freedom that reduce it to a product in a “free market.” We must take seriously the vulnerability that comes from true communication, so that we can create free systems with which we can live.
(Simon Penny has his hand raised. I hope they call on him. Nope.) Question about schizophrenia, Foucault and docile bodies, extending to a second question about software and ideology. Hey, it’s Jennifer Gonzalez, I think! (Yep.) Wendy says she didn’t follow up schizophrenia more because she’s focused more on paranoia. Interested in Lacan on paranoid knowledge, as a way of getting beyond control as freedom. Also Daniel Paul Schreber, and how much his account sounds like fiber-optic networks. Not saying “we’re all paranoid schizophrenics.” But rather that these moments point to where our knowledge ends.
Question about how ideology changes, within software, in the move from functions to objects. What about glue languages? Wendy says she was focusing on the way they’re both imperative, but it’s important that objects hide more, are more about abstraction. How odd, she says, that in CS this is considered to be empowering for the programmer. Yes, it’s important to look at the specifics of languages. Scripting languages, compiled languages, the oddnesses of LISP. Talking about how strange she found the metaphors of APL when she was first learning it at IBM.
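(My gloss again, not hers: a small Python contrast between the two styles being discussed. Both are imperative, as she says, but the object version tucks the state and the rules away behind method calls, where the procedural version leaves the data lying open to whoever holds it. Names are made up for illustration.)

    # Procedural style: the data is a plain dict, fully exposed to any caller.
    def deposit(account, amount):
        account["balance"] += amount

    checking = {"balance": 0}
    deposit(checking, 50)
    checking["balance"] = -999          # nothing prevents reaching in and breaking the rules

    # Object style: the same operation, but the balance is hidden behind an
    # interface that enforces its own rules.
    class Account:
        def __init__(self):
            self._balance = 0           # internal by convention

        def deposit(self, amount):
            if amount <= 0:
                raise ValueError("deposits must be positive")
            self._balance += amount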
Marc Davis has the first of four quick questions that will be answered at once. He’s wondering about notions of the subject: what’s the role of the inscription of the subject in this software-as-ideology model? Simon Penny wants to probe her assertion that ubicomp extends the screen. Another question: what’s the difference between software and language, given that both can be carriers of ideology? Then Sara Diamond, asking about visualization, which existed long before digital visualization, and whether metaphor and the imaginary don’t have to have a place. Then Tapio has to slip in one more thing: shouldn’t there be a culturally specific history here? Didn’t your quotes all come from a particular period in the US, with culturally specific notions of freedom?
Wendy says: she’s working against notions of freedom that de-subjectivize (Bush says freedom is free trade). Simon, your work is an excellent example of moving beyond the screen. The whole idea that code is law (Lessig) is a lawyer’s wet dream. Visualization shouldn’t have clear causality; that clear causality is the problem with the logic of scientific imagery. Yes, we should have culturally specific histories.
They’ve cut things off — the previous session went late. Too bad.
See also notes from Axel Bruns.
Ah, and now another presentation in this session. Nina Wakeford: “The Identity Politics of Mobility and Design Culture.” See Jill’s notes.