November 10, 2004

The Numerist Fallacy

by Noah Wardrip-Fruin · 12:09 am

I’ve started to get quite annoyed by something I’m thinking of as the “numerist fallacy.” It seems to come up mostly in discussions with humanists and artists who are interested in software but haven’t been involved in much software development.

Its most ridiculous form is the idea that, because digital information is stored as ones and zeros, computers somehow inherently introduce binarism (black and white thinking) into situations where they are used. Luckily, this is somewhat rare. More common is the idea that somehow, if one wants to consider something like the structure of a digital archive deeply — in order to enable more informed critique — one should get down to the numerical nature of the archive and understand how the numbers are being manipulated.

While there might be some interesting work to be done about the assembly code of the device drivers called by the software written for digital archives, I think it will be an even smaller category than the interesting work done about paper and ink manufacturing technologies and their relationship to print archives. From the point of view of empowering critique, it would make no difference if the information were stored as chemical sequences, with each element having five different states (rather than, as we have it now, in magnetic or optical media, with each element having two different states).

If one were seeking to identify the place where a digital archive is structured technologically — the site where power plays out in a manner that leaves textual traces and can be critiqued — wouldn’t one find a better candidate in the neighborhood of software design, engineering, and coding? That’s where decisions are implemented, where power is made operational. Shouldn’t people concentrate on getting deep into understanding programming languages and development practices, if they want to go deep?

Anyway, I’m almost annoyed enough to start writing an essay about this. But surely someone has already written a rant of this sort, and I just haven’t run across it. Can someone out there offer me an appropriate pointer, or even a pointer to something that addresses a portion of this?

50 Responses to “The Numerist Fallacy”


  1. Matt K. Says:

    Noah, I’d like to hear more. Are you saying that hardware, storage considerations are essentially immaterial (so to speak)? A lot of my work has come to rest on the materiality of hardware, particularly storage devices, and the way they impact our experience of new media: consider what the hard drive has done for iPod and Tivo, to give two quick examples.

  2. noah Says:

    Matt, I’ve been reading the excerpts from some of that work on your blog (1, 2) and enjoying them. I also think we’re talking about fundamentally different things. I’m speaking against the urge I’m hearing expressed to “get down to the bits” in order to “really understand what’s going on” in complex technological projects and be able to critique them. In something like the design of a digital archive, what’s going on at the bit level is of pretty limited interest. It’s the same thing that’s going on at the bit level for everything else that uses that hardware, and probably the archive could be moved to different hardware without changing its design. That’s because the design is encoded at a different layer.

    The design is encoded in software that calls libraries that call device drivers that write bits. This higher level is where people need to look if they want to understand what’s going on and be able to critique it. Here the design of the archive is made explicit in language that defines processes. A mistake at this level is what would likely cause a vote archive to operate inappropriately. It’s here that the software defines things like the limits on what kinds of relationships can be expressed, what kinds of assumptions are made about chronology, and other things that humanists and artists might be prepared to critique.
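
    To make that concrete, here’s a minimal sketch, in Python, of the kind of design decision I mean (the names and constraints are invented for illustration, not taken from any real archive). The assumptions live in the class definition, not in how the bytes land on the disk:

        from dataclasses import dataclass, field
        from datetime import date
        from typing import List, Optional

        @dataclass
        class ArchiveItem:
            # Design decisions, visible and critiquable at the code level:
            # every item must carry exactly one exact date (no "circa 1920s"),
            # and may have at most one parent (a strict hierarchy, no webs).
            title: str
            created: date                           # chronology assumption: one exact date
            parent: Optional["ArchiveItem"] = None  # relationship assumption: one parent at most
            keywords: List[str] = field(default_factory=list)

        # A scholar who needs an uncertain date, or a document that belongs to two
        # collections at once, runs into these choices here, regardless of whether
        # the bits land on magnetic, optical, or some exotic future storage.
        letter = ArchiveItem(title="Letter to the editor", created=date(1921, 5, 4))
        print(letter.title, letter.created)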

  3. michael Says:

    I too have noticed this fallacy in humanistic writings about computation. Computation has nothing to do with numbers – it’s all about structure and process. The fact that, in digital computers, the structures are ultimately made out of bits, is irrelevant to software studies analyses of structure and process (yes, perhaps there’s some work to do at the device driver level, but that’s about it). The “reduce it all to numbers” perspective also misses one of the most crucial features of computation: abstraction layers. Abstractions seal off the underlying layers (though they leak, which is also interesting): the tower of different abstractions, different structures and processes, how abstractions are established, and the relationship between layers (including abstraction leaks) is where the conceptual action is.

  4. noah Says:

    Michael, I agree entirely. However, we may be letting computer scientists off too easily so far in this discussion. David Durand just pointed out to me that my annoyance may have risen to the surface recently because I’ve been reading some CS stuff that shares a certain numerical focus with the adherents of this fallacy. People who think that theorizing what is computable is the “real computer science” (while human-computer interaction, for example, isn’t) are IMHO perpetrating a numerical fallacy of their own. It’s not the same mistake, but from the outside an insistence that “whether we can get from one number to another” (or, if we’re feeling generous, also “how we can get from one number to another more efficiently”) is the real computer science probably makes it easier for those exposed to this talk to mistake the numbers on the disk as the site of the action.

  5. Jonathan Says:

    “No more than astronomy is about telescopes” does describe a domain of research that is qualitatively different from human-computer interaction, but do people really argue that the latter therefore shouldn’t be done, or at least shouldn’t be done in CS departments? I assume from the comment above that they do, but this seems oddly parochial.

    I’d also be curious for published examples of the “bit-level” reasoning mentioned above.

  6. Marie-Laure Says:

    Let me add a rant of my own, which may or may not be directly related to what Noah has in mind. It’s the use of the contrast “digital” vs. “analog” to distinguish, well, “new” and “old” media. I don’t particularly like the terms new and old media either, but I find that contrast less annoying than digital vs. analog. For the “digital vs. analog” people, writing (with a pen on paper) is an analog medium, while writing with a word processor is digital. This completely misses the semiotic similarity of the two types of writing; both are language-based, and while the computer opens new possibilities of artistic expression, there is a cognitive level on which a word written with a pen on paper and a word on a computer screen operate the same way. Plus, writing is black-on-white—doesn’t that make it digital? Digital and analog refer to a mode of encoding that may or may not make any difference to the user; for instance, I can’t tell the difference between a digital and analog recording of music, and an image on a computer screen looks analog to me. I guess that’s what Katherine Hayles means by her idea of the “oreo-cookie” structure of computerized information; but if the info ends up looking to the user like it is analog, what’s the theoretical value of the distinction?
    Incidentally, “numérique” is the standard French translation of “digital”; French theorists commonly talk about “la culture numérique” to refer to computer games and the like. This too annoys me. Lev Manovich seems to follow this usage when he defines one of the distinctive properties of new media as “numerical representation.” Surely he means encoding through zeroes and ones, even though zeroes and ones can be read as many things other than numbers.

  7. noah Says:

    Jonathan, the disdain I’ve heard expressed for HCI doesn’t extend to throwing it out of CS departments (as long as it keeps bringing in funding). But I’ve certainly heard, repeatedly, that HCI needs to get better about quantifying its results if anyone is going to take it seriously as CS. (Which translates to the idea that it’s not CS right now, and needs to be number-centric in order to become CS.)

    As for published examples of the bit-level reasoning I cited, they might be hard to come by. The people I’m talking about would be unlikely to publish — because they’re talking about what they don’t know, what they feel that they want to (ought to) know more about. I’m arguing that they’ve mis-identified the area in which it would be fruitful for them to “dig deeper.”

    On the other hand, I have heard people say that folks like Sadie Plant (who I’ve never read) have published versions of this reasoning. Unfortunately, I can’t offer any opinion as to whether this is an accurate assessment.

  8. noah Says:

    Marie-Laure, I think we are talking about different, but related, things. Finding a name for our field is difficult. And terms like “digital” and “numérique” probably help reinforce the numerist fallacy.

    I wonder what would be better. Should we be talking about “computational” media? That might be helpful, because I’d rather have people ask “in what sense is it computational?” than “in what sense is it digital?”

  9. nick Says:

    I like “computational” better than “digital,” too.

    One thing that might be useful in this discussion is to critique a specific argument, speech, paper, or book, by a specific person, as being based on or promoting this “numerist fallacy.”

    I agree that sometimes the “binary oppositions” of semiotics and structuralism are related to the computer, sometimes in misleading ways, but now that I think about it further, I’m not sure that the comments I’ve heard along these lines are anything more than offhand, misdirected jabs. Are people really basing their fundamental arguments, and basing their humanistic approaches to the computer, on a belief that computers are only capable of “binary opposition” in the structuralist sense? They might be, but I’d want to read some of that first.

    I also don’t really have any idea how the theory of computation would be related to this fallacy, or what the connection is to HCI. I’ve never heard any insistence from anyone that HCI researchers need to study more number theory, any more than I’ve heard insistence that theory and algorithms researchers need to study ethnography. In computer and information science, we have programming languages research, computer architecture, AI and machine learning, natural language processing, databases, robotics, computer graphics, and many other subfields. Faculty are well aware that they use different approaches. Any calls for more quantitative measurement in HCI are, I think, far more likely to come from granting agencies – who want “metrics” that can be used to evaluate whether or not they got their money’s worth – rather than from someone doing computational geometry or something.

  10. andrew Says:

    But I’ve certainly heard, repeatedly, that HCI needs to get better about quantifying its results if anyone is going to take it seriously as CS.

    In interactive drama and believable agent research — an interdisciplinary field comprising CS, drama, psychology, etc., that to my understanding is not yet taken fully seriously as CS — quantifying results has been an issue. When your research goals are intertwined with artistic expression and entertainment, it can be tough to measure progress. (As I’ve only dipped a foot partway into the academic system, it’s not an issue I’ve personally had to deal with much; by contrast, in industry, the only metric that tends to matter is number-of-units-sold, although some think industry can do better than that.) In the context of academic research, I’m sure Michael can speak more to the issues involved in quantifying the results of, for example, interactive drama research, for the purposes of proving that progress is being made, in order to enable future funding, etc.

    (Which translates to the idea that it’s not CS right now, and needs to be number-centric in order to become CS.)

    Are numerical measurements the only meaningful way CS research can be evaluated? For example, there are algorithmic proofs, which I suppose are math-oriented, but not strictly number-oriented?

    Speaking of quantifying HCI-type results, here’s a new book (that I mentioned a few weeks ago) on Evaluating Embodied Conversation Agents — I assume it addresses the numeric-centric issue to some degree. Michael also recently attended a workshop on that theme.

  11. Erik C Says:

    “…in industry, the only metric that tends to matter is number-of-units-sold”
    I respectfully disagree; the industry also gives out awards and keynote-speech money (at least I imagine it does) ;)
    Academia doesn’t necessarily value popularity enough, in my opinion. Citeseer and other such resources are, in this (memetic) regard, potentially revolutionary.
    I wonder if you’ve all thought of collating a book from your thoughts above, such as on the misunderstandings of computing (and/or new media). And perhaps how to create alternative and improved understandings. I would buy it.

  12. Matt K. Says:

    Going back to Marie-Laure’s comment: one of the first things I tell my students on the first day of class is to throw the idea that digital = computer or high tech out the window. I tell them to look down at their fingers. The essence of the digital is discreteness–we have ten fingers, we have ten discrete appendages. That’s why they’re useful for counting. Then I get out the Scrabble tiles. Then I get out the moveable lead type. A week or two later, when we’re talking about Oulipian language manipulation, they get it. Well, some. ;-)

    The place to go for the semiotic theory here–a book Kari Kraus originally turned me on to–is Nelson Goodman’s Languages of Art.

  13. mark Says:

    Well, like everyone else I agree, but I’ve come across something that strikes me as similar in philosophical writings on computation, except that the argument is, in principle, hard to refute. It goes roughly as follows.

    Everyone agrees that computers using ones and zeros for encoding is irrelevant, but perhaps the fact that computers must use some inherently discrete representation is not similarly irrelevant. Tying in to the “analog vs. digital” issue, if we posit that the “real world” is inherently continuous, then any discrete representation can only be an approximation. It can be an approximation to arbitrary precision, but if chaotic processes are operating on these approximations, the error can grow arbitrarily large; this also provides a mechanism by which one can argue that even arbitrarily small discretization error can propagate upwards to impact higher-level processes. Therefore, the claim goes, discrete representations are inherently incapable of modeling the real (continuous) world. (This is sometimes used to argue that AI is impossible.)
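
    A quick numerical illustration of that amplification step, just to make it concrete (a Python toy using the logistic map, not an example from the philosophical literature): two starting values that differ only in the tenth decimal place end up nowhere near each other within a hundred iterations.

        # Logistic map x -> r*x*(1-x) in its chaotic regime (r = 4.0).
        # A "discretization error" of 1e-10 in the initial value grows until
        # the two trajectories are effectively unrelated.
        r = 4.0
        x, y = 0.3, 0.3 + 1e-10

        for step in range(1, 101):
            x = r * x * (1 - x)
            y = r * y * (1 - y)
            if step % 25 == 0:
                print(f"step {step:3d}: |x - y| = {abs(x - y):.2e}")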

    I think attempts to extend a line of argument like that into digital/new media are unlikely to be convincing, though…

  14. Erik C Says:

    Two issues/illusions: computers force you to think in binary, or they are reductivist.

    closest article I could find

    Why the Digital Computer is Dead

    A nice example of digital vs analogue
    Numbers and symbols
    but to support my point that the confusion is wider:
    What is New Media?
    Proof that Computers/Internet improve learning
    Or instead of the links you could try ‘On the Internet’ by the philosophy professor Hubert Dreyfus. At the other extreme from Dreyfus is cyberspace=liquid architecture. Yet people want constraints and genres. We have moved from The Lawnmower Man to The Matrix for a reason.

  15. Dirk Scheuring Says:

    noah wrote:

    I agree that sometimes the “binary oppositions” of semiotics and structuralism are related to the computer, sometimes in misleading ways, but now that I think about it further, I’m not sure that the comments I’ve heard along these lines are anything more than offhand, misdirected jabs. Are people really basing their fundamental arguments, and basing their humanistic approaches to the computer, on a belief that computers are only capable of “binary opposition” in the structuralist sense? They might be, but I’d want to read some of that first.

    I guess that would be me. Yes, as a writer-turned-programmer-at-age-40, I’ve learned not to underestimate the power of the bit, and to acknowledge the difference between analog and digital, at least when seen from the point of view of the computer itself. I think that at least in my domain – natural language processing – an artist still needs to know which bits to push where, and why.

    I also am aware of the higher level tools that are available to “support” me with my work, and I’ve used them enough to know how they work, and for what they don’t work. And as of yet, I have not found anything that does the necessary abstraction in a way that I like, or even agree with. This means that I use an ordinary programming editor, a tool that creates ’nuff awareness of the fact that I’m pushing bits, just by making me write out the Boolean-valued variables that are needed. I’m also working at the level where I have to worry about whether my computation halts or not, so here’s the difference between “analog” and “digital” from my POV: in “digital”, your process has to terminate, and it becomes part of the work of the artist to make sure that it does.

    Computers work in a way where you have to find the extremes, the Boolean values, the 1’s and 0’s – the bits – first, before you can build them up to bytes, with which you’re able to process Bayesian values: those, then, can open up the infinite universe of values between 0 and 1 for you. If, as an artist, I can find somebody to do (most of) the bit-pushing for me (I’m still pushing bits, but they’re abstracted, e.g. into a radio button in a Photoshop dialog window), then that’s great. The rule of thumb for me personally has become: if I can find the tools to do a job in the way I want, all I have to do is learn to use the tool. If I can’t find a tool I need, and have to build it from scratch, there seems to be no way for me to avoid “going numerical”.

    And since I know how to do that, I also know of the power of the bits, a power which is not only technical, but which can also be a huge political, social, and economic power. If your art is supposed to somehow address this power, it might well be worth learning about it at the bit level, and in such a situation doing so could not just be classified as some “fallacy”. IMHO.

  16. nick Says:

    Dirk, thanks for the reply. I was actually the one, rather than Noah, who wrote what you quoted.

    Noah’s original post took issue with the idea that “because digital information is stored as ones and zeros, computers somehow inherently introduce binarism (black and white thinking) into situations where they are used.” I don’t think either of us was making any statement about whether artists should use “higher level tools” or not, or saying that it’s a bad idea to know about how computers work. I think we both agree with you that a computational perspective is useful, although Noah can comment on that further if he likes.

    Why do you see the “power of the bit,” and the use of a base 2 number system specifically, as synonymous with “the difference between analog and digital”?

    You mention that you “have to worry about whether [your] computation halts or not.” Do you mean that the halting problem essentially has something to do with bits? It doesn’t seem that way to me. Turing’s “On computable numbers, with an application to the Entscheidungsproblem” was published in 1936, several years before the “invention of bits” in Shannon’s master’s thesis, which mapped electronic circuits to Boolean algebra. It doesn’t rely on base 2 arithmetic; unary or base 10 numbers are fine for the proof. Formal language theory considers computability using alphabets with arbitrary numbers of symbols – equivalent to numbers of arbitrary base. So I’m not clear on why bits are special from a theoretical perspective.
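
    To put that last point in code (a toy Python sketch, mine, not anything from Turing or Shannon): the same computation can be handed numerals spelled in any base, and the base turns out to be nothing more than a choice of surface alphabet.

        # The "same number" spelled in four different alphabets; the computation
        # (here, squaring) neither knows nor cares which spelling was used.
        spellings = {2: "101101", 8: "55", 10: "45", 16: "2d"}

        for base, numeral in spellings.items():
            n = int(numeral, base)   # decode from that alphabet
            print(f"base {base:2d}: {numeral:>6} -> {n}, squared = {n * n}")
        # Every line reports the same value (45) and the same square (2025).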

    You write that “Computers work in a way where you have to find the extremes, the Boolean values, the 1’s and 0’s – the bits – first, before you can build them up to bytes, with which you’re able to process Bayesian values: those, then, can open up the infinite universe of values between 0 and 1 for you.” I’m intrigued by this, but I don’t quite know what you mean. Obviously you don’t mean that I have to flip switches on and off to enter data into my computer. Do you mean that Bayesian reasoning can only be done based on binary observations? And that Bayesian reasoning is the one essential type of computation that computers do?

  17. Dirk Scheuring Says:

    Nick, noah, sorry for the mis-identification. Nick, you asked:

    Why do you see the “power of the bit,” and the use of a base 2 number system specifically, as synonymous with “the difference between analog and digital”?

    For practical reasons. If I make music, and I’m on stage with an acoustic guitar, my performance may fail for various “analog” reasons: broken strings, wrong tuning, lack of dexterity of my digits… If my musical instrument of choice is a computer, “analog” reasons for failure still prevail: broken line connections, faulty power supplies, blown speakers… but on top of those, I acquire a whole host of purely “digital” reasons for failure: like wrong device drivers, incompatible file formats, a missing BIOS update… To successfully perform using “digital” tools, you have to know how to use the power of the bit, or you risk getting lost. I’ve seen this quite often, too.

    You mention that you “have to worry about whether [your] computation halts or not.” Do you mean that the halting problem essentially has something to do with bits? It doesn’t seem that way to me. Turing’s “On computable numbers, with an application to the Entscheidungsproblem” was published in 1936, several years before the “invention of bits” in Shannon’s master’s thesis, which mapped electronic circuits to Boolean algebra. It doesn’t rely on base 2 arithmetic; unary or base 10 numbers are fine for the proof. Formal language theory considers computability using alphabets with arbitrary numbers of symbols – equivalent to numbers of arbitrary base. So I’m not clear on why bits are special from a theoretical perspective.

    They aren’t. That’s because a theoretical Turing Machine is unlimited in terms of storage space and processing time. But since I’m coming in from the implementation end, the time and space I have to work with are definitely finite, so I can only use a subset of the theoretical possibilities of Turing Machines for my practical purposes. Although Turing did not have to find a way to avoid infinite regress, designing an actual Turing Machine implementation that’s testable means using a finite “approximation” of infinity – you can’t implement infinity, in any computer language. You need to have one last bit that says: “If the computation didn’t halt until here, I’ll make it halt, like, right now”. That’s the most powerful one of them all.
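
    If it helps, here is the kind of thing I mean by that “one last bit”, boiled down to a toy Python sketch of my own (not anything from Turing): a hard cutoff that forces a possibly-non-terminating process to stop.

        def run_with_cutoff(step, state, max_steps=10_000):
            """Iterate `step` on `state`; the cutoff is the "one last bit"
            that guarantees a halt even if the process never halts on its own."""
            for i in range(max_steps):
                state, done = step(state)
                if done:
                    return state, i       # the computation halted by itself
            return state, max_steps       # we *made* it halt

        # A process whose own "done" flag never turns True:
        result, steps = run_with_cutoff(lambda s: (s + 1, False), 0)
        print(steps)   # 10000 -- the cutoff did the halting for us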

    Do you mean that Bayesian reasoning can only be done based on binary observations?

    “Bayesian reasoning” can only be implemented in a digital system on a binary foundation. You have to know what “0” means and what “1” means – a bit – before you can build any system that supports “pseudo-analog” values like 0.3418735, which might be the value of some weight in a Bayesian reasoning system.

  18. noah Says:

    Dirk, I suspect we’re talking at cross-purposes here. You’re not saying that the particular way that bits get written to the particular hard drive you’re using matters to you as an artist or developer, are you? Rather, you’re saying that it’s important that you understand how different types of numbers function in your computational environment, right?

    I’m saying that this could be simulated on a vastly different storage medium and it wouldn’t make a difference to you as long as your programs were able to read/write the information at about the same speed. The logic of how you’re manipulating numbers isn’t something you author down “close” to the hard drive. You’re authoring that logic at a higher level, using a language that makes certain types of numbers available to you. Up at that higher level you structure the logic of your software.

    Now, it’s probably true that people have to understand something about the types of numbers available in your computational environment in order to understand the structure you’ve authored, and they have to understand the language in which you’ve authored your structures. So, when humanists or artists want to “get deep” into technical projects, they should start learning more about things like the language in which you’ve authored your structure and the types of numbers you’re using. Neither of these involves getting down to how the bits are moving around. Neither you nor they would know, or care, if your device drivers were suddenly and silently switched out for a new set that arranged the bits in a completely different order – as long as performance didn’t degrade too much.
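
    Here’s a small Python sketch of that last point (the function and names are made up for illustration): the same authored logic runs against two completely different storage substrates, and nothing above the write() call has to change.

        import io
        import tempfile

        def record_entry(store, title, year):
            # This is the level at which the archive's logic is authored:
            # what gets recorded, in what order, under which assumptions.
            store.write(f"{year}\t{title}\n")

        # One "medium": an ordinary file on whatever disk happens to be underneath.
        with tempfile.TemporaryFile(mode="w+") as on_disk:
            record_entry(on_disk, "Field notes", 1972)
            on_disk.seek(0)
            from_disk = on_disk.read()

        # Another "medium": a buffer that never touches a disk at all.
        in_memory = io.StringIO()
        record_entry(in_memory, "Field notes", 1972)

        print(from_disk == in_memory.getvalue())   # True -- same logic, different substrate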

    Matt K, can you tell us more about Languages of Art? Do you use it in your teaching or cite it in your writing?

    Erik C, thanks for the links, and the book encouragement. One thing Nick and I wrote, Acid-Free Bits, attempts to explain certain computational media choices to writers, artists, and scholars in a way that may serve to counter some misunderstandings. It doesn’t, however, specifically take aim at many misconceptions. I wonder if I would just get too grumpy, writing such a thing. I guess I’ll know more in a bit, if I end up feeling I need to write this “numerist fallacy” essay.

    Speaking of links, a short web search for Sadie Plant brought this one:
    http://www.t0.or.at/sadie/binary.htm
    At the other end of which I found this passage:

    Messengers, messages, and the points between which they circulate are coded in the 0 and 1 of binary maths, an off/on switch which is, as Baudrillard writes, “no longer a distinctive opposition or established difference. It is a ‘bit’, the smallest unit of electronic impulse – no longer a unit of meaning […] This is what the matrix of information and communication is like, and how the networks function.”

    Although there is a sense in which the stark reductionism of binary code reinforces the binaries of the modern sexual economy, it also has quite contrary effects. The introduction of binary code introduces a plane of equivalence which undermines the very foundations of a world in which male and female have played the roles of superstructure and material base. Go-betweens become more important than that which they go between; communications systems gain a life of their own; networks and machines learn to turn themselves on.

    While I still think most who feel that “going deep” means “learning about binary data storage” aren’t publishing these thoughts (who publishes “I wish I knew more about x”?) it may be that writings like Plant’s have helped plant the seeds of this desire. We may need alternative publications to help people understand where their learning might be more fruitfully directed.

  19. Dirk Scheuring Says:

    noah asked

    Dirk, I suspect we’re talking at cross-purposes here. You’re not saying that the particular way that bits get written to the particular hard drive you’re using matters to you as an artist or developer, are you?

    They might very well matter. For instance, even taking a basic tool such as an editor, the conventions for encoding line breaks in plain text files are different under Windows from how this matter is handled under Linux. I have seen digital artworks that failed to run because the artist hadn’t known this. It’s a small thing, but it can break your whole app.

    And behind this little detail lurks the global rift that oftentimes gets represented by the Windows/Linux pair of concepts. A certain way of writing a bit to a hard disk might be patented, so I’m not allowed to use it in my artwork – better if I know of another way to do it that is license-free. I understand how you consider yourself isolated from these peculiarities by the software architecture that you’re given, and that may work for many purposes, but when we do this, we should be aware that we’re still relying on the results of somebody else’s view of “reality” – and that might mean not only the technical, but also the political, social and economic values and preferences that this “somebody else” represents.

    So, when humanists or artists want to “get deep” into technical projects, they should start learning more about things like the language in which you’ve authored your structure and the types of numbers you’re using. Neither of these involves getting down to how the bits are moving around.

    Okay, but if they start out knowing how bits move around in general, they might be faster at understanding what happens in a new program they get to read/critique. Some people might feel that, if they happen to read/critique different programs on the regular, and have invested in understanding programming on the bit level, they actually save lots of time in the long run. YMMV, though.

  20. noah Says:

    Dirk, I’m still convinced that we aren’t disagreeing, but merely miscommunicating:

    For instance, even taking a basic tool such as an editor, the conventions for encoding line breaks in plain text files are different under Windows from how this matter is handled under Linux. I have seen digital artworks that failed to run because the artist hadn’t known this. It’s a small thing, but it can break your whole app.

    Yes, but this isn’t an example of “moving bits around” in the sense in which I’m speaking. That text file could be written to a magnetic hard drive, an optical CD-ROM, or an experimental biological storage system and (given appropriate device drivers) you’d still get the same error when moving between operating systems. Same even with something like big-endian notation. These are logical constructs that get recorded as binary numbers — but they could be recorded in other ways, and the level at which they’re most legible (which is also the level at which they are constructed) is not the bit level. It’s the code level.
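
    Here’s that line-break case as a few lines of Python (my own illustration, not tied to any particular artwork): the Windows/Linux difference is visible, and fixable, entirely at the level of characters in a string, without a word about how the underlying medium stores them.

        windows_text = "first line\r\nsecond line\r\n"   # CR+LF convention
        unix_text    = "first line\nsecond line\n"       # LF-only convention

        # The difference is a logical one between two text conventions...
        print(windows_text == unix_text)                         # False
        # ...and the repair happens at the same logical level.
        print(windows_text.replace("\r\n", "\n") == unix_text)   # True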

  21. hanna Says:

    Sorry to leap in here, but as a researcher who concentrates on Bayesian reasoning and inference, I’m bothered by Dirk’s assertion:

    Computers work in a way where you have to find the extremes, the Boolean values, the 1’s and 0’s – the bits – first, before you can build them up to bytes, with which you’re able to process Bayesian values: those, then, can open up the infinite universe of values between 0 and 1 for you.

    I’m well aware that this will be seen by most as merely nit-picking, but I have to disagree with your use of the term Bayesian values. You seem to be using the term “Bayesian values” to refer to real numbers in general, which, really, have nothing to do with probability. True, the ability to represent real numbers necessarily implies that representation of values between 0 and 1 is possible; however, this is something useful for probability in general, and has nothing to do with treating probabilities from a Bayesian viewpoint.

    “Bayesian reasoning” can only be implemented in a digital system on a binary foundation. You have to know what “0” means and what “1” means – a bit – before you can build any system that supports “pseudo-analog” values like 0.3418735, which might be the value of some weight in a Bayesian reasoning system.

    Again, I’m bothered by the introduction of Bayesianism into this discussion; these comments are no less pertinent to any other set of operations involving real numbers.

    Finally, I do agree that one must be aware of the finite nature of the representations used for real numbers on computer systems, and furthermore, that a knowledge of how floating point numbers are represented using binary digits can be valuable; however, I do not think this is inherently problematic for anyone who has read the IEEE 754 standard.
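
    Since the finite representation came up, here is a three-line Python illustration of the kind of awareness I mean (standard library only): the bit pattern is the IEEE 754 double closest to 0.1, and the decimal expansion is the value that double actually stores.

        import struct
        from decimal import Decimal

        bits = struct.unpack(">Q", struct.pack(">d", 0.1))[0]
        print(f"{bits:064b}")   # sign, exponent and fraction bits of the double nearest 0.1
        print(Decimal(0.1))     # 0.1000000000000000055511151231257827021181583404541015625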

  22. nick Says:

    Dirk, thanks for your reply. I am not completely following your discussion of Turing Machines, but one thing I read there and throughout is what I commented on in the first place. You have characterized all discrete math as being binary, saying that implementations of Turing Machines need to have “one last bit” and that “‘Bayesian reasoning’ can only be implemented in a digital system on a binary foundation.” While you need to be able to enumerate, that is, count, in order to do computation in the usual sense – and this is the case whether you carry it out by hand or by electronic or mechanical means – it doesn’t matter whether you use base 2, base 8, base 10, base 16 or something else. So I still don’t see how a fundamental concern with extremes and 1s and 0s has a basis in an understanding of computation.

  23. Matt K. Says:

    Languages of Art:

    Goodman’s an analytic philosopher, and LoA is hardcore. The best known terms he works with are “allographic” and “autographic.” For Goodman, an autographic work is one whose ontology *defies* reproduction, while an allographic work is one whose ontology is *fulfilled* in reproduction. Thus, a painting is autographic, a novel is allographic. This dovetails with a familiar question in textual criticism: if the Mona Lisa is in the Louvre, then where is the text of Hamlet? The point is that we make a commonplace/commonsensical distinction between original and reproduction in the case of the Mona Lisa, but we don’t feel we have to go to the British Library and inspect an original Quarto (or recover the Bard’s lost manuscripts) to have a valid experience of “Hamlet.” Closely associated with this is Goodman’s principle of “sameness of spelling.” An allographic work is successfully reproduced if it has the same spelling as its predecessor. Take War and Peace. If you imagine it as one long string of characters, then any identical string (same spelling) is a valid instance of the work. Here’s where we see the importance of the digital and the discrete: there is no corresponding way in which to break down the “text” of the Mona Lisa. Now, if we digitize the painting, then it becomes subject to formal manipulation with a tool like Photoshop, and lossless transmission of the image: but precisely because the work is now rendered in an allographic, rather than autographic, state.
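
    Goodman’s criterion is strikingly easy to operationalize for digital objects. A toy Python sketch (mine, not Goodman’s) of why allographic identity and lossless copying go together:

        import hashlib

        copy_a = "Well, Prince, so Genoa and Lucca are now just family estates..."
        copy_b = "Well, Prince, so Genoa and Lucca are now just family estates..."

        # "Sameness of spelling": two instances count as the same allographic work
        # exactly when they are the same string of characters.
        print(copy_a == copy_b)                                   # True
        print(hashlib.sha256(copy_a.encode()).hexdigest() ==
              hashlib.sha256(copy_b.encode()).hexdigest())        # True
        # There is no analogous character-level test for the Mona Lisa.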

    All of this, of course, elides the kind of argument about materiality that you might expect from a Johanna Drucker or a Kate Hayles or a Jerome McGann or even a Matt Kirschenbaum. Or a Walter Benjamin. I employ Goodman’s ideas strategically, not as the final word on the nature of digital, but as one illustration of why distinctions between digital and analog are worth preserving. The *materiality* of digital media has to rest on its propensity for allographic behavior.

    I suspect Kari will have some more to say.

  24. Matt K. Says:

    PS: I’ve read Goodman with my graduate classes. I offer a summary of these ideas to my undergrads.

  25. Marie-Laure Says:

    To return to the stance that irritates Noah (and me too): does the use of computers force you into “digital,” black and white thinking? What I’d like to ask is this: isn’t thinking basically binary? The neurons of the brain are bipolar, either “on” or “off.” Thinking is a process of binary decision-making, and nuanced opinions (like: there are no “good guys and bad guys, only moderately good and bad guys”) are just a matter of finer granularity. Like those electoral maps: everybody cast a blue or red vote (except for the Nader voters, who are not accounted for), and the purple areas are a blend of red and blue.

    So my question is: is analog thinking possible at all?

  26. Dirk Scheuring Says:

    hanna,

    I used the juxtaposition of Boolean/Bayesian values only because it was the first illustration that came to my mind (both types are fundamentally relevant to my current work), not because I think that “Bayesian values” are the same as real numbers in general. It was just a quick example (with a tempting alliteration to boot ;-) to show something you cannot build using digital computers (okay, I’ll specify this – not just any possible digital computer, but the kind that most people seem to use today) if you don’t understand how to represent it using binaries. No generalizations in terms of “Bayesianism” (cute word, that) were intended. Sorry for upsetting you.

    And noah, you said:

    You have characterized all discrete math as being binary, saying that implementations of Turing Machines need to have “one last bit” and that “‘Bayesian reasoning’ can only be implemented in a digital system on a binary foundation.”

    Nah… can’t remember characterizing “all discrete math” as anything; sorry to be misunderstood: I was just talking about programming computer systems that are in general use today. I’m serious about that “one last bit” bit, though; it’s a real drag. What is true is that it’s not a problem limited to binary systems; it just so happens that binary systems, due to their ubiquity, are where programmers encounter the problem most often today (so often, in fact, that I can’t recall anybody ever telling me that s/he encountered it in a different context). But strictly speaking, it has to do with undecidability, recursion, and infinity, and not with the number base you choose… We digress, though.

    What got me thinking was your original post about the alleged “numerical fallacy”, followed by you saying that

    I’m arguing that they’ve mis-identified the area in which it would be fruitful for them to “dig deeper.”

    Just four years ago, this whole discussion would have meant nothing to me; for about 15 years, I used computer programs to create art for different media, content with not digging any deeper.

    A year later, the situation had changed: I had hit a snag, found that I lacked a certain tool, and tried to figure out what it would take for me to develop it myself. Had anybody told me that it would not be “fruitful” to dig any deeper, I might have taken this advice, and probably would have waited forever for this tool to magically arrive one fine day. As it was, though, I let my ignorance take the lead, almost blindly vacuuming up information about the inner workings of computer systems and languages for a while (since I didn’t know what I wouldn’t need to know), making a seemingly endless series of mistakes in trying to make sense of it all, but finally winding up with exactly the kind of device that I was looking for at the outset. And in hindsight, even the “frantic” bits of “thrashing around” in order to find a solution make perfect sense. So, with this experience in the background, countering an artist’s inquiries about the bit-level stuff in our programs with something along the lines of “I never needed to know this, so you’ll never need to know this, either” (which was how I understood your post) seemed inappropriate to me – after all, I knew that digging deeper sure can have its gains.

  27. Matt K. Says:

    Nelson G.: well that sure shut down the conversation. ;-)

  28. nick Says:

    Thanks for the comment, Dirk. I have to mention that the first text you just quoted was written by me, not Noah. The second text you quoted was something Noah wrote in answer to a question I posted on the thread. Noah and I do collaborate often, but we’ve also appeared in person together several times to try to lay to rest the idea that we’re the same person.

    I very much agree that artists and writers who use computers as an essential part of their practice, and computer creatives of other sorts, should learn more about computers and computation when they’re motivated to do so, as I’ve tried to do myself.

  29. noah Says:

    Marie-Laure, I guess my thought is that there may be something binary we could find “at the root” of any process that seems analog. So, for example, we could say that drawing with a pencil is really binary. It’s a process of making the smallest, lightest possible mark, or not, over and over — until there is a picture. (Luckily, this is sped up by the fact that the pencil lets us make a whole lot of marks all at once.) But I’m not sure if we gain anything from this approach. By which I guess I mean to say, some things are more binary than others. Potato stamps have lots of surface irregularities (so each time you stamp is different), but the choice of whether to stamp again or not seems binary.

    Matt K, thanks for the further info on Goodman’s LoA. Now that you say more I know this book has come up in a relatively recent conversation — I think at UCSB’s “Digital Retroaction” symposium. It seems like a good way to help students understand what it is that makes digital files reproducible. But I wonder about something like audio tapes, which seemed fulfilled by reproduction (or selective reproduction onto mix tapes) when I was in college — but where the reproductions were far from “lossless” (and therefore didn’t make it over the bar of “sameness of spelling”). Am I missing something?

  30. Dirk Scheuring Says:

    BBC – Binary Bayesian Craziness! Dig the diagnosis! Get down with my sickness!

  31. Aaron Says:

    Author: Here’s my manuscript.

    Publisher: Wow! Is that Xerox paper?

    (Gasoline provider, meat provider, content provider)

  32. Aaron Says:

    Ah, here’s the link I was looking for.

    Miscellaneous comments:

    One way in which discussion of Bayesian theory is relevant: I’m often annoyed by people reducing logic and science to making statements that must be true or false; this seems to be a staple for science fiction writers. Bayesian theory provides a principled logic for uncertainty.

    In the CS department where I work, there is definitely a culture gap which makes it difficult for the more theoretical faculty to support hiring systems or HCI researchers that aren’t doing a lot of deep math because they usually don’t view their work as “fundamental.”

    Although neurons in the brain are binary in one sense (either spiking or non-spiking), the timings and rate of spiking are not binary, and continuous values and continuous levels of probability can be represented by timing and rate of firing.
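
    A toy Python sketch of the rate-coding point (a caricature of a neuron, not a model from the neuroscience literature): each “spike” is strictly all-or-nothing, yet the rate across many spikes recovers a continuous value.

        import random

        random.seed(0)
        value = 0.37   # a continuous quantity to be conveyed
        spikes = [1 if random.random() < value else 0 for _ in range(10_000)]

        print(set(spikes))                  # {0, 1} -- each event is binary
        print(sum(spikes) / len(spikes))    # ~0.37 -- the rate carries the continuous value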

  33. Dirk Scheuring Says:

    Right on, Aaron.

  34. Aaron Says:

    One other comment: if CS types are criticizing your work, then perhaps you haven’t made a sufficiently-successful effort to explain to outsiders the value of what you do. It is very easy for most of us to stay within our own communities where the value of the work is more-or-less understood (even as people struggle to define what the field is), but it is dangerous for a community to become too parochial.

    Or maybe it’s them.

  35. Aaron Says:

    More reductio ad absurdum:

    Because the Roman alphabet only contains 26 letters, the English language can only represent 26 concepts.

  36. noah Says:

    Aaron, thanks — I like some of those reductions!

    Now, perhaps I should start a new thread to ask this, but what else can people suggest in Matt’s vein? That is, what else can people suggest that they think is helpful for people trying to understand these issues who come from arts and humanities backgrounds?

    I can start off by saying that someone recently introduced me to Guy Steele’s “How to Grow a Language.” I think it’s a great piece — it’s like an Oulipian writing exercise following a constraint derived from thinking about programming languages, and it’s also a good introduction to (and argument about) some of those issues.

    Of course, there are also the connections from our procedural literacy discussion and Matt’s pedagogy of programming page.

    Other suggestions?

  37. Dirk Scheuring Says:

    There’s an online C++ tutorial that can be understood by people who have never programmed before, covering basic questions like “How do programming languages work, anyway?” and “What is a variable?”

    An alternative is to start with Scheme, probably by reading “How to Design Programs”, the textbook for PLT Scheme.

    Then there’s “Instant Hacking”, an introduction to Python written for non-(or not-yet)-programmers that’s part of the larger Python for Non-Programmers project.

    Yet another way to start would be by learning Squeak, which uses Smalltalk to create a programming environment for beginners.

    Before this list grows too long, and too rational, I’ll throw in a text that’s more of a crazy suggestion as reading material for programming novices: Alan Perlis’ “Epigrams on Programming”. At first reading, I understood next to nothing, but just found statements like “Syntactic sugar causes cancer of the semi-colons” aesthetically pleasing. I returned to them later on, knowing more, and found that not only did I understand more of them with each visit, but they also sharpened my focus on quite a lot of concepts that other texts only gave me some vague idea about.

    Finally, here’s a link to Guy Steele’s “How to Grow A Language” paper that noah mentioned…

  38. Dirk Scheuring Says:

    “How to Grow A Language” didn’t work. Hope it does now.

  39. nick Says:

    I just (hopefully) fixed the links to Steele’s paper.

  40. Dirk Scheuring Says:

    This one might be of interest, too, not only for artists inching towards programming, but also for programmers inching towards art: “The Poetry of Programming”, an interview with the computer scientist/poet Richard Gabriel, in which he proposes a program that offers a Master of Fine Arts in software development. Fav quote: “When I’m writing poetry, it feels like the center of my thinking is in a particular place, and when I’m writing code the center of my thinking feels in the same kind of place.”

  41. Dirk Scheuring Says:

    Noah, I just remembered a concrete example of “applied craziness” which might illustrate how figuring out a low-level software protocol that had no obvious connection to my art helped me in building it. About three years ago, while still knowing very little about computer languages, I was looking for ways to expand AIML – a small language based on a simple stimulus-response model – in order to be able to use it to do more complex computations. In search of inspiration, I actually ended up working my way through the CORBA spec. Why? Because I wanted to know what a “three-tiered architecture” was, and how it worked. I remember next to nothing about CORBA, and probably wouldn’t ever have remembered the incident itself if we hadn’t had this discussion here. But the example of CORBA gave me the necessary idea – put some sort of “middleware” between “stimulus” and “response”, like CORBA is “middleware” between “client” and “server” – and from there, I could proceed to build what I wanted.

    Sure, in hindsight, I must acknowledge that there are more elegant ways to learn what I learned there – it’s just that those ways were not visible from the trajectory that I was coming in on. And it worked. And that’s where my belief is grounded that low-level computer knowledge – ideas which are portable between otherwise unrelated applications – can be useful for artists.

  42. noah Says:

    Dirk, I’m not arguing that low-level computer knowledge can’t be useful for artists and scholars. Much the opposite!

    I’m arguing that knowledge about how things get made is more useful than knowledge about how they are stored.

  43. Dirk Scheuring Says:

    Yes, and that’s exactly where I don’t understand you: how can I make these things if I don’t know how they are stored?

  44. Jonathan Beyrak Lev Says:

    Hello! I’m new around here so forgive me if I say anything overly ignorant or otherwise inappropriate.

    That notion of the Numerist Fallacy is intriguing. What with all this accursed post-modernism (sorry if my bias is showing) running around the academic world, not to mention the artistic world, it’s now commonly accepted for every art form that isn’t delivered through a computer that the text doesn’t actually EXIST anywhere except inside the reader’s/viewer’s/listener’s/user’s/whoever’s mind.

    It’s very strange that those same people who have no problem with the idea that a novel isn’t a pile of ink-stained paper and a film isn’t a bunch of celluloid would find it difficult to understand that a program isn’t a great heap of 0’s and 1’s stored away on some hard drive. Could it be that of all areas of human thinking in modern times, the most important one – the computer – is the least touched by post-modernism?

    I hope this digression is within reason, but I also think that something commonly misunderstood is that narrative isn’t a bunch of causal dramatic events. Like any text, narrative only exists inside the audience’s mind, and it is our experience of it that makes it what it is. Chris Crawford’s work, for example, seems, in theory, to be excellent at recreating narrative “externally”, that is, at generating, from algorithms that respond to user input, a narrative which might be acceptable as that of a story or play. However, the user might not experience it as such, for example because he might be seeing himself as an actor in a play, which immediately eliminates his ability to experience it as a narrative. I haven’t yet read anything addressing that issue.

  45. noah Says:

    Jonathan, your worry about the user seeing herself as an actor in a play reminds me a bit of Mark Bernstein’s “My Friend Hamlet” argument, which you might find interesting. As for your worries about postmodernism, I can’t help you.

    Meanwhile, related to the rest of the conversation, I thought I’d drop a link to the Dictionary of Algorithms and Data Structures.

  46. Jonathan Beyrak Lev Says:

    Noah, I would be thankful if you could share with me why my question reminds you of the “My Friend Hamlet” argument. I think I’m missing the connection.

    BTW, my only worry about post-modernism is that it exists :)

  47. noah Says:

    Jonathan, let’s move this to a more appropriate thread.

  48. Dirk Scheuring Says:

    Numerist fallacy? Design by numbers.

    Via Matt Kirschenbaum.

  49. noah Says:

    Um, Dirk, I don’t think DBN is related to how things are stored on disk (with binary math or not). In fact, I think it’s an even greater level of abstraction from that level.

    Wait, I’m probably missing the humor here. (These are the problems that led to the rise of smiley faces in network communication.)

  50. Dirk Scheuring Says:

    ;->
