October 23, 2004
Wired and the vision thing
It’s a sad fact that, in the mid-1990s, as the field of digital media anticipated by Ted Nelson’s 1974 Computer Lib / Dream Machines exploded in size, the book was out of print and many new to the field were largely unfamiliar with Nelson’s work — and quite a few even with his name.
Wired magazine, the most prominent publication for new and aspiring ’90s ‘digerati,’ ran a story in June 1995 that introduced many to Nelson’s work. Unfortunately, the piece was dedicated to casting Nelson in the worst possible light — beginning with its title, ‘The Curse of Xanadu.’ Nelson was called ‘the king of unsuccessful software development.’ (I won’t link to the article, but you can find it via web search, if you’re looking for drivel.)
There are many ways of disputing the presentation of Nelson in Wired’s article, but at this moment it might be more interesting to make a comparison with a figure from digital media’s history that Wired has presented rather differently — Nicholas Negroponte. Wired has identified Negroponte, among many glowing appellations, as ‘the Media Lab’s visionary founder.’ My question here is: What made Nelson ‘unsuccessful’ and Negroponte ‘visionary’ in Wired’s estimation?
We can start by looking at the influential mid-1970s books written by the two. Nelson’s is Computer Lib / Dream Machines, and Wired correctly identified Xanadu as Nelson’s biggest dream within it. Negroponte’s book was the 1975 Soft Architecture Machines, a follow-on to his 1970 The Architecture Machine. As one might guess from the titles, the Architecture Machine was Negroponte’s biggest dream outlined within his book.
Wired was perfectly correct to point out that Nelson’s Xanadu never shipped. The Architecture Machine — also never shipped. So clearly this is not where Wired’s difference lies between unsuccessful and visionary.
Another possible difference we might identify is in the level of effort expended. Xanadu has never shipped after tens of person-years invested, many with funding from Autodesk. The Architecture Machine, on the other hand, has never shipped after thousands of person-years invested — by the MIT Architecture Machine Group and Media Lab — with funding from MIT, the Department of Defense, the Walt Disney Company, Philip Morris, and a list of additional sponsors that once famously required a tiny font to fit on the Media Lab’s promotional notepad.
One can’t help but be a bit surprised, given this, that Wired describes Xanadu as ‘the longest-running vaporware project in the history of computing — a 30-year saga of rabid prototyping and heart-slashing despair’ while calling the Architecture Machine ‘a great example of interactivity’ that ‘obviously faced a few short-term fabrication constraints.’
But perhaps if the word ‘visionary’ is on the table we might do better to examine the substance of the visions of Nelson and Negroponte. Nelson’s Xanadu imagined personal computers as portals into a world-wide hypermedia publishing network. Negroponte’s Architecture Machine imagined personal computers as artificial intelligence-driven hyper-personalized assistants. Nelson’s vision, of course, now seems prescient — since the explosive growth of the Web and other forms of network media. Negroponte’s vision, unfortunately, hasn’t fared as well.
So who’s the visionary?
Of course, that’s the wrong question. And much of the above is a collection of red herrings. Ivan Sutherland’s Sketchpad never shipped, Doug Engelbart’s NLS never shipped, Alan Kay’s Dynabook never shipped — and yet we’re agreed that these were founding visions for the field of digital media. The same is true for both Xanadu and the Architecture Machine.
No, the real question is when we will recover from the mid-1990s ‘Curse of Wired’ and the distorted view of digital media’s history that this overly influential publication offered.
October 23rd, 2004 at 3:52 pm
And of course it’s misleading (though perhaps no more so than much of Wired’s article) to suggest that the MIT Media Lab’s work was primarily aimed at constructing the Architecture Machine. And mention of the Media Lab is a good reminder that Negroponte has also made significant contributions to the field of digital media through his institutional work — for example, convincing the iconic and powerful MIT to offer a degree in “media arts and sciences.”
October 24th, 2004 at 8:23 am
I first read about Xanadu in an interview with Nelson conducted by John Perry Barlow, published in the August 1991 issue of Mondo 2000 (a much wilder and more interesting precursor of Wired). That piece painted a much more sympathetic picture of Nelson’s quest than the one in Wired did; however, like most of the Mondo articles, it doesn’t seem to be available online (you can find a history of that magazine here, illustrating, among other things, how Wired stole Mondo’s thunder). What I did find, though, is a 1996 interview with Nelson entitled “Orality and Hypertext”, in which Nelson replies to the “Curse” article, calling it “libelous”:
An intense guy. As for comparing him with Negroponte, I think that Nelson (and Mondo) might have been too intense for the Nineties, when more pragmatically minded entities like Negroponte (and Wired) looked like they would win the world, or at least acted that way. Times change, though; I’ve not heard much of Negroponte lately, whereas Nelson sightings are seemingly reported from every corner of cyberspace ATM.
October 24th, 2004 at 2:29 pm
Here’s an afterthought, or rather, an after-question: according to Nelson’s definition, hypertext is “text that branches or performs on request”. Now, to me, a computer program is nothing but a form of text, and branching or performing on request is something that computer programs usually do. Would it be wrong to say, then, that any computer program can be seen as an example of hypertext? If so, then why?

Another, somewhat related question: can chatterbots be seen as an example of hypertext?
October 24th, 2004 at 3:02 pm
Dirk, good questions. A couple of quick thoughts. Nelson makes a differentiation between hypertext, which is a type of hypermedia, and computer programs that aren’t media. For example, in his 1970 “No More Teachers’ Dirty Looks,” he refers to visual calculators as “facilities.” He also strongly argues that these media are authored by people (rather than automagically generated by AI). That’s why hypermedia was presented as an alternative to AI-driven models of computer-aided instruction (in that same 1970 article). Of course, that can be a slippery distinction. Michael and Andrew are using tools that have an AI history, but they’re using them to create authored media…
October 24th, 2004 at 11:28 pm
I read the MONDO 2000 and Wired articles when they were first released. About three weeks ago, while continuing my research for PlaySpace, I landed on some work by Mr. Nelson. I was astounded by the brilliance of his work. I’m hopeful that the best work and the best ideas find a way to overcome the shouting, fear mongering, and base marketing efforts that seem to win out all too often. In some of Mr. Nelson’s ideas, I see a far better world than the one I live in today.
October 25th, 2004 at 1:04 am
Kenneth, I think you and Dirk (in his comment about the 90s) may both have something there. People like Nelson (and Doug Engelbart) who struggled without significant research funding or widespread recognition through the last decade now seem to be reaching a wider audience. There’s hope.
BTW, could I ask which work of Nelson’s it was you came upon recently?
October 25th, 2004 at 4:50 am
Thanks for clearing that up, Noah. I think it really is about time that Nelson’s early works are made more widely available. His 1970 distinction between texts that are media and texts that are not seems arbitrary when viewed from a 2004 point of view and, in my opinion, complicates matters too much for me to follow it. However, if I read the history right, nobody who was in CS in 1970 saw AI as an authored medium – everybody (and this seems to include Nelson) expected it to be some kind of magical “othermind”. “Othermind” as in “non-human”. Looking, for instance, at this 1969 paper by John McCarthy, and the distinctly non-human notion of “general intelligence” it presents, it’s easy to see why a more humane computer scientist like Nelson would have wanted to make some distinction between that kind of “facility” and the authored text that he had in mind.
Times are different now, however; those magical “facilities” that the erstwhile AI mainstream (McCarthy et al.) had planned on never did materialize, and it has become difficult to defend any computer program as “not authored” by human beings, even if said program does generate new code it then works with (“co-authorship” or not? – but this should be discussed separately…). The “magic” has evaporated; what’s left is AI as a medium, comparable to traditional media in content (examples: comparing the content of Cyc to that of the Encyclopedia Britannica; comparing the content of Façade to that of Who’s Afraid of Virginia Woolf?), only different in interface. If I ever meet Ted Nelson, I’ll ask him if he thinks that the distinction between “hypertext” and “facilities” is still necessary today. Unless somebody beats me to it :-)
October 25th, 2004 at 9:53 am
Dirk, sorry — I think I was unclear.
Nelson’s distinction between “hypermedia” and “facilities” is different from his distinction between human authorship and AI.
“Facilities” (as I read Nelson) are tools, rather than media. So this would include calculators, spreadsheets, etc. But this is also a distinction that’s becoming trickier as we get further from 1970. I guess it means that some web pages, for example, are hypermedia (interactive poems) — while others are facilities (online forms), and some might be both.
As for views of AI vs. authorship, I think on some level this debate is still alive. Nelson was arguing against a view that says, for example, “Let’s make a model of how learning really happens in the brain and then use that to determine what to show the student next.” He was arguing in favor of a view that says, for example, “Let’s think about the progression of this as a media experience and use our knowledge as authors and designers to decide what to show next.”
October 26th, 2004 at 2:49 am
Well, a simple post-Marxist analysis is in order. First of all, Negroponte represents the most monied part of one of the world’s most monied institutions. Money was essential to the Wired vision. In ’97, when you and I were at NYU, I published a critique of the cheerleading function of his column at Wired [http://mrl.nyu.edu/~andruid/ecology/theory/lexicon/sign.html]. Some of us like our technology blended with critique, as well as creativity.
Yet certainly, as you observe, the move to create a degree called “media arts and sciences,” was obviously a positive development that very few institutions have been able to replicate, despite the obvious sense of it.
Nelson was relatively itinerant, making him an outsider in the Wired hyperreality of the Net as economic opportunity. His essays on hypertext and non-linear reality articulate specific concepts rather clearly. While Computer Lib/Dream Machines has been unavailable, at least “A File Structure for the Complex, the Changing, and the Indeterminate” is in the ACM DL.
While the work of both of these men has been influential, a competition seems off target. We don’t need to figure which leaders to follow. Just to keep developing concepts and making stuff.
(unless the point is to make an argument with a book publisher, in which case, yes, Computer Lib / Dream Machines should be in print on its own, in full.)
October 26th, 2004 at 2:18 pm
I’d take Wired’s superlatives towards NN with a grain of salt – we can argue over whether he deserves them, but what else would they say about someone who was not only a senior columnist, but a founder and an investor in the magazine as well? (cf. Thomas A. Bass, “Being Nicholas”. Wired, 11/95. )
As for Nelson, the personal story aspects of the Wired piece may be too mean, but I think their take on the project is right on. Why has nothing come of the man-eons put into the system? Surely it’s not the technology that’s holding him up – programming is always the easy part. The problems are conceptual, and they’re huge.
October 26th, 2004 at 3:59 pm
Rob, nothing has come of the vision of a world-wide hypertext publishing network? Then I guess we should have similarly negative articles about Ivan Sutherland’s Sketchpad, Alan Kay’s Dynabook, Doug Engelbart’s NLS, and just about every other system that formed and predicted what our field has become (because few of them ever shipped as products in the forms worked on by their creators).
Or do I misunderstand you?
October 26th, 2004 at 4:09 pm
Andruid, I think you’re right. Negroponte fit in well with Wired’s ideology (and, as Rob points out, was closely connected to them in other ways) while Nelson felt a little too Whole Earth Catalog to them (despite his basically libertarian outlook).
And, to give credit where it’s due, I seem to remember it was you, in a Williamsburg bar some years ago, who pointed out to me the importance of Negroponte’s contributions on an institutional level (and the surprising fact, even then, of how few institutions had followed).
October 26th, 2004 at 8:33 pm
Noah,
I meant: nothing has come of their software. They’ve been writing it for decades now. Nothing, including Windows, takes that long. What happened? (Maybe something as simple as Spolsky-rant-inducing design mistakes, but who really knows?)
Some of the ideas were very interesting, and have been part of popular computing for some time now: hyperlinks, document versioning, even piecemeal data construction, as manifested in database-backed websites.
But some of their essential desiderata seem fundamentally flawed. Some flaws are technical – the fragment exchange and document versioning would require that all Xanadu servers make up a single global distributed database, and problems with distributed databases are legion (coherence issues, maintenance issues, etc).
And some are social. For example, the ability of anyone to link comments to anything anywhere. This would very quickly diverge into a kind of a global Wiki, along with all of the associated social problems. Worse yet, unbreakable links would mean that people can’t host their own content – the entire system would have to include measures that actively prevented authors from deleting old content. That strikes me as a hard sell. :)
Still, even with those problems, they declared neither defeat nor victory, and are still working on it. And perhaps that’s what’s most disheartening. There’s no sensible reason why a first release of any piece of software should take so long.
Sorry for the rant. :)
October 26th, 2004 at 10:11 pm
Rob, I disagree with you, but we may have to agree to disagree. I think some of our disagreement may simply come from the fact that I’ve read more of Nelson’s material. For example, on the global wiki issue: Nelson calls the approach of preventing others from making links “back end filtering,” and instead advocates “front end filtering.” You could set up your front end filtering to show only those links created by the document’s author — and get the linking model we have now. But maybe you’d also like to see any links created by someone in your organization. And perhaps those made by your friends. And perhaps those made by nonprofits with which you have sympathy. Nelson agreed with you that people wouldn’t want to see every link anyone might make.
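(To make the front end filtering idea concrete, here’s a minimal sketch in Python. Everything here — the names, the data structure, the policies — is my own hypothetical illustration of the idea, not anything from actual Xanadu code. The point is just that all links exist in the system, and each reader’s front end decides which ones to display:)

```python
# Sketch of "front end filtering": links from anyone are stored,
# but the reader's own policy decides which links are shown.
from dataclasses import dataclass


@dataclass(frozen=True)
class Link:
    author: str  # who created the link
    source: str  # anchor in the current document
    target: str  # document the link points to


def visible_links(all_links, policy):
    """Return only the links the reader's filtering policy accepts."""
    return [link for link in all_links if policy(link)]


links = [
    Link("author", "intro", "doc-A"),     # the document author's own link
    Link("colleague", "intro", "doc-B"),  # a link made by a third party
    Link("stranger", "intro", "doc-C"),   # another third-party link
]

# Today's Web model: show only the document author's own links.
authors_only = visible_links(links, lambda l: l.author == "author")

# A broader policy: the author plus people you trust.
trusted = {"author", "colleague"}
trusted_links = visible_links(links, lambda l: l.author in trusted)
```

Under this model the “everyone can link to anything” problem becomes a per-reader display choice rather than a global flood.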
At the same time, I don’t mean to give the impression that I think Nelson was right about everything. For example, he said that there would be no central directory, and instead people would have businesses that were competing directories. So far so good — we have Yahoo, and Google, and so forth. But what Nelson didn’t foresee was that statistical methods would clobber hand-creation of directories. That is, he didn’t foresee that we would search the web with Google much more often than we’d traverse Yahoo’s site categorization hierarchy.
As for the fact that people are still interested in pursuing Xanadu, I guess I just don’t see your point. There’s no giant Microsoft-style team working on Xanadu. Engelbart still uses NLS, Kay still pursues his Dynabook goals through Squeak. But, again, I sense this isn’t something we’re going to see eye-to-eye on.
October 27th, 2004 at 1:18 pm
Noah, thanks for the clarification! BTW, regarding the not-seeing-eye-to-eye thing: I’ve nothing against Xanadu-the-ideal, as a force driving one’s research, it’s Xanadu-the-software-project that I’m complaining about.
What makes it such a disaster in my view is that even now, decades later, they still haven’t been able to decompose their vision into smaller, manageable, implementable pieces (or so I assume – otherwise they would’ve implemented them a long time ago, right?). This raises the question: why is this the case? Surely it can’t be because the problem really is a monolithic block that can’t be taken apart?
And yes, I’m working from the stance that computer science research is only valuable when it produces results, whether in the form of artifacts, algorithms, or theorems. Not to sound too snarky, but ideas of unknown viability are a dime a dozen. :) Both Engelbart and Sutherland had prototypes that demonstrated the viability of what they proposed. I’m waiting for one from Nelson before I start taking him seriously.
October 27th, 2004 at 2:38 pm
Rob, it sounds like you’ll be glad to know that the Hypertext Editing System that Nelson and Andy van Dam collaborated on in the 1960s not only functioned, but was sold to NASA by IBM and used to produce the documentation that went up with the Apollo missions.
October 27th, 2004 at 5:20 pm
Indeed, that’s very cool! Thanks!
So do you have any ideas why Xanadu has been in stasis for so long?
October 27th, 2004 at 7:08 pm
Rob, I think it’s because Xanadu is a big project that, since the end of its time at Autodesk, has had 0 full-time people working on its implementation.