Of Computing.
In continuing to put off part two of my DFW essay, I offer some thoughts on computers & computing. Follow links for footnotes.
***
1. What Is a Computer For?
The first computer I ever owned was a little black-and-white Macintosh that my mother brought home from some far-off electronics store in the suburbs. It was about the size of a fish tank set on its side,[1] and had a screen not much larger than a grown man’s hand. I was thirteen. I took it into my bedroom and set it up on a spare dining chair that had been mouldering in the darkest corners of the basement. Then, for a few weeks, it sat, unused: what is a computer for? I had learned to type on an electric typewriter as a child. I had played rudimentary games on an Apple IIe at school. They had keyboards and screens. I remember wishing I could attach an antenna to it and watch TV.
Eventually I discovered a few games hidden deep in its files, simple things like Cannon Fodder, which involved an extremely abstract simulacrum of wartime artillery — no more than blips floating across the screen, a game that boiled down to an intuitive sense of geometry. I procured ClarisWorks, that long-ago ancestor of Pages, the word processor I’m using now, some nineteen years later, to write this. That’s what a computer was for: it was a video game system and a typewriter that you could keep on a wood-backed chair next to your bed. I never crunched a number with it, and certainly never communicated with a human being beyond the bounds of my bedroom using it, except inasmuch as I might have written and then printed out a letter that I later put in the mail. Its chief bonus over a typewriter was that I didn’t have to keep track of reams of paper if I wrote a story on it.
The Online Etymology Dictionary entry for “computer” reads as follows:
computer (n.)
1640s, “one who calculates,” agent noun from compute. Meaning “calculating machine” (of any type) is from 1897; in modern use, “programmable digital electronic computer” (1945; theoretical from 1937, as Turing machine). ENIAC (1946) is usually considered the first.
There’s more, but that’s the relevant bit. ENIAC is a hyperlink which leads to a curious entry in the Online Etymology Dictionary (“acronym from ‘electronic numeral integrator and computer,’ device built 1946 at the University of Pennsylvania …” — it contains no actual etymological data of value, and beyond that defines a word not in use as anything other than a proper noun for a single instantiation of a device). At the bottom of the entry for “computer” is the following citation: “WASHINGTON (AP) — A New York Congressman says the use of computers to record personal data on individuals, such as their credit background, ‘is just frightening to me.’ [news article, March 17, 1968]”.
So what is a computer for? When it was a human being, it was a person who did mathematical operations. The initial serious gestures at taking this activity out of fallible human hands were not actually given the same name; Charles Babbage, who labored at many curious pursuits, designed a theoretical “Difference Engine” whose purpose was to produce accurate arithmetical tables to coordinate trains — but before long, in the mechanized, optimistic Victorian age, the device began to seem real enough that it no longer wanted a strange, capitalized name, and it became the same mundane thing a man with a slate and an abacus was. The first successful programmable and fully automated computers were built, not coincidentally, in the early 1940s — Germany had the Z3; the British, Alan Turing’s Bombe; the Americans were hard at work on ENIAC. These computers did math, usually in the service of making or breaking military codes. They computed, quite literally.
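A brief aside for the curious: the Difference Engine’s trick was the method of finite differences, by which a table of any polynomial can be extended using nothing but repeated addition, the one operation a mechanical engine can be trusted to repeat without error. Here is a rough sketch in Python; the quadratic and the starting point are chosen purely for illustration, not taken from Babbage.

```python
# Method of finite differences: tabulate a polynomial using only addition,
# the principle behind Babbage's Difference Engine.
# The polynomial f(x) = 2x^2 + 3x + 5 is an illustrative choice, not Babbage's.

def difference_table(f, start, count):
    """Tabulate f at integer points start, start+1, ... using only additions
    once the initial differences have been seeded."""
    f0, f1, f2 = f(start), f(start + 1), f(start + 2)
    d1 = f1 - f0             # first difference
    d2 = (f2 - f1) - d1      # second difference (constant for a quadratic)

    values = []
    value = f0
    for _ in range(count):
        values.append(value)
        value += d1          # each new table entry needs only an addition...
        d1 += d2             # ...and so does each new first difference
    return values

f = lambda x: 2 * x * x + 3 * x + 5
print(difference_table(f, 0, 8))
# [5, 10, 19, 32, 49, 70, 95, 124] -- the same numbers as computing f directly,
# but produced by addition alone, which gears and levers can do reliably.
```

Once the first few differences are seeded, every further entry in the table falls out of additions alone, which is why the thing could, in principle, be built from brass.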
In order to compute, they had to have memory — to perform the several steps of a complicated logarithm, a computer had to be able to “remember”, so to speak, the steps that had come before and the ones that were to come. By 1968, the memory function had become at least as important as the mathematical one — and, just as computers had been weaponized as targeting systems, code makers, and code breakers, the memory now became something weird and frightening, a threat to privacy. The nature of a computer was changing, growing: memory was more than math.
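It is worth pausing on what “remember” means here. Even a single logarithm, computed by its series expansion, requires the machine to hold a running total and the last term between steps; lose either and the calculation collapses. A hedged sketch in Python, with the particular series and tolerance being my own choices for illustration:

```python
import math

def ln_by_series(y, tolerance=1e-12):
    """Approximate ln(y) for y > 0 using the series
    ln(y) = 2 * (x + x^3/3 + x^5/5 + ...), where x = (y - 1) / (y + 1).
    Each step depends on values 'remembered' from the step before."""
    x = (y - 1) / (y + 1)
    power = x          # remembered: the current odd power of x
    k = 1              # remembered: the current odd exponent
    total = 0.0        # remembered: the running sum
    while abs(power / k) > tolerance:
        total += power / k
        power *= x * x     # the next odd power is built from the previous one
        k += 2
    return 2 * total

print(ln_by_series(10.0))   # ~2.302585...
print(math.log(10.0))       # cross-check against the library value
```

The three variables carried across that loop are the “memory” in miniature; the early machines held the same sort of intermediate state in relays, tubes, and cards.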
By 1993, when my parents gave me a (probably used) Macintosh Classic, memory had grown cheap, and our credit histories were stored on computers all right — but memory had been miniaturized, and brought into the home, and again the nature of the computer was changing: the memory machine had become a toy and a tool; it was not just a storage device, but something that interacted with a user, and had graphical outputs that made much more than math possible. Underlying this somewhere was still a programmable electronic machine that operated on a mathematical principle, but my little Macintosh had very little to do with Babbage’s Difference Engine, in size, shape, purpose, or use.
2. Of Copying.
Charles Babbage was a wealthy polymath, the Lucasian Professor of Mathematics at Cambridge, a train enthusiast, a compulsive cataloguer and taxonomist,[2] and probably the smartest man in the western world for a while in the 19th century. He was also, as noted above, an early advocate of the idea that automated mathematical calculations could remove human error in arithmetical tables, thereby increasing both safety and productivity for train conductors, among other things. Steampunk novels, including one by William Gibson and Bruce Sterling called The Difference Engine, often posit an alternate history in which Babbage’s Difference Engine, which proved to be too cumbersome and pricey for Victorian manufacture, was successfully built, spurring the kind of explosion in computing that occurred after Turing and others built their code machines.
Mathematics was not the only place where Babbage saw the utility of automation.[3] One of the places that his bizarre compulsion for cataloguing manifested itself was in the abstruse 1832 tome On the Economy of Machinery and Manufactures. In the chapter “Of Copying”, which is about “COPYING, taken in its most extensive sense” [viz., inclusive of things like bullets, bedframes and boxes], he delineates a theory of labor that fits nicely in the path that runs from Adam Smith to Karl Marx. He notices that, in such copying work, “the instrument or tool actually producing the work, shall cost five or even ten thousand times the price of each individual specimen of its power”, and seems to be trying to answer the question: how could this possibly be a workable economic model? How can we pay for an expensive machine by making things that we sell cheaply?
The answer is twofold: efficiency, and time. Both are encapsulated in a somewhat throwaway line at the end of his description of the printing of calico “by Cylinders”: “A piece of calico twenty-eight yards in length rolls through this press and is printed in four or five minutes”. Efficiency: a great deal of material can be processed, and so can be sold cheaply because sold in bulk, and thereby still pay for itself. Time: it can be done in four or five minutes, leaving a laborer with time to make more calico — or to go do something else, something inventive and important. (Or, no doubt, to cause a street nuisance.) Whereas many previous theories of labor operated on the assumption that there was a finite amount of work available to be done in the world, most increase in wealth coming from land and husbandry, Babbage imagined a universe in which economies expanded to meet the ever-multiplying abilities of their laborers and inventors. This has its antecedents in Smith’s The Wealth of Nations, in which the great dour man of the dismal science noticed, among other things, “[t]hat the industry which is carried on in towns is, every where in Europe, more advantageous than that which is carried on in the country”.
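Babbage’s puzzle is, at bottom, an amortization problem: a machine costing thousands of times the price of a single specimen pays for itself only if it turns out copies quickly enough, and in sufficient volume, for thin per-copy margins to add up. A back-of-the-envelope sketch in Python, every figure in it invented for illustration rather than drawn from Babbage:

```python
# Back-of-the-envelope amortization, in the spirit of Babbage's question:
# how does an expensive copying machine pay for itself through cheap copies?
# All figures below are invented for illustration.

machine_cost = 10_000          # machine costs 10,000x the price of one piece
margin_per_piece = 0.25        # profit on a single piece after materials and labour
minutes_per_piece = 5          # "printed in four or five minutes"

pieces_to_break_even = machine_cost / margin_per_piece
minutes_to_break_even = pieces_to_break_even * minutes_per_piece
working_days = minutes_to_break_even / (10 * 60)   # assume a ten-hour working day

print(f"pieces to break even:  {pieces_to_break_even:,.0f}")
print(f"working days required: {working_days:,.0f}")
# Efficiency (a thin margin on each of very many copies) and time (five minutes
# a piece rather than days of hand work) together make the machine pay.
```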
I mention Smith here because he spends a fair amount of time in The Wealth of Nations (published in 1776) arguing in favor of a public education system, to combat an ill he sees as resultant from the very same manufacture that Babbage would, forty-odd years later, so exhaustively catalogue:
In the progress of the division of labour, the employment of the far greater part of those who live by labour … comes to be confined to a few very simple operations… . The man whose whole life is spent in performing a few simple operations, of which the effects too are, perhaps, always the same … has no occasion to exert his understanding, or to exercise his invention.
The efficiencies of manufacture make city work dull, and this results in a dull populace. The time afforded by manufacture, however, allows for the education of the masses. Babbage begs Smith’s question in “Of Copying”.
Babbage spends a fair portion of “Of Copying” considering the kind of copying that feels familiar to a modern reader: the copying and distribution of text. This is, I think, not a coincidence: the copying of texts (and arithmetical tables) was an information technology, and a tool for education; Gutenberg, though he went broke, invented mass communication on a scale before incomprehensible, by figuring out how to copy texts quickly. The efficiencies of manufacture, public education for laborers, the incipient literacy of the working classes: information and its distribution, in the age of Babbage, were becoming hugely important.
Information and its distribution. What would a modern Babbage, in writing his “Of Copying”, come to say about the copying of text? Because something weird has happened to copying: the vanguard in Babbage’s day — movable type — has become déclassé. The very fact that it makes many, many copies, copies which take up space, and can be lost, and cost money to make and maintain, and take time to deliver,[4] is now seen as a drawback. Instead, a descendant of Babbage’s theoretical Difference Engine has made it so that, in fact, there need only be one copy of many texts, one copy that can be accessed, in theory, infinitely, by infinitely many people with computers: in order for “Of Copying” to be available in public education, one need not print it out thousands upon thousands of times, but simply copy it once into Google Books. The efficiencies have been turned on their heads.
3. ARPANET: Mind as Internet.
Not long after that New York congressman quoted in the Online Etymology Dictionary entry for “computer” expressed his dismay at the thought of computers as giant, intrusive memory machines, Science and Technology Magazine published an article by J.C.R. Licklider and Robert W. Taylor entitled, “The Computer as a Communication Device”. It begins with a bold assertion: “In a few years, men will be able to communicate more effectively through a machine than face to face”. They then go on to detail a meeting “held through a computer”, in which “the group accomplished with the aid of a computer what normally might have taken a week.” No matter that they were all in the same room; they might well have been on opposite sides of the planet.
Taylor was, at the time, director of the computer research program at the U.S. Defense Department’s Advanced Research Projects Agency [ARPA]. Licklider was a psychologist and electrical engineer and Taylor’s boss, the head of ARPA’s Behavioral Sciences and Command and Control (read: real-time decision-making) departments. The team that Licklider assembled would be nicknamed the Intergalactic Computer Group, dedicated largely to what they called “communications engineering”, and to a radical new idea of what communication meant, and what it did:
We believe that communicators have to do something nontrivial with the information they send and receive. And we believe that we are entering a technological age in which we will be able to interact with the richness of living information — not merely in the passive way that we have become accustomed to using books and libraries, but as active participants in an ongoing process, bringing something to it through our interaction with it, and not simply receiving something from it by our connection to it.
“The richness of living information”: they believed that computers were not memory banks, but minds, or at the very least a physical extension of human minds. Though the idea of information as being alive and powerful has now become conventional, at the time it may have been difficult to conceptualize: information came, and would continue to come for most people and for some time, in an inert form, as ink on a page, with all the connotations of permanence and authority that that form still holds.
They were talking about interactive computing, and they were inventing ARPAnet, the precursor to the internet that went live at the end of 1969. Licklider would be gone by the time the switch was thrown; Taylor would soon leave as well, for Xerox, which invented the graphic interface appropriated by Steve Jobs and now used by nearly every computer on the face of the planet. But the two of them had begun yet another quantum shift in what a “computer” was.
What’s a computer for? In 1968, in the popular imagination, it was a giant machine that calculated telemetry, plotted routes for nuclear missiles, and maybe — just maybe — collected information of a sort that we weren’t totally sure we wanted collected all in one place. The popular imagination, as it always has in this area, lagged: computers had been memory machines for a long time. And it isn’t that they ceased to be memory machines the day Licklider and Taylor had their staff meeting in a room with a bunch of networked computers, collaborating to make sure that they were thinking similar things — what they call “modelling” in their paper. But they became something else, too. When people could remotely access that memory, change it and leave it different, computers suddenly became wildly more powerful: they not only stored our thoughts for us, but projected them outward into the world, opened them up for discussion. The possibilities were endless.
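The mechanical change is small enough to sketch. Once a computer’s memory can be read and altered from elsewhere, it stops being a private filing cabinet and becomes a shared surface. Here is a toy illustration in Python, one process holding a store of “thoughts” that any client can read or rewrite over the network; the little text protocol, the port number, and the example keys are all my own inventions, nothing like ARPAnet’s actual protocols:

```python
import socket
import threading
import time

# A toy picture of networked, shared memory: one process keeps a dictionary of
# "thoughts"; any client can GET or SET an entry over TCP. The one-line text
# protocol ("GET key" / "SET key value") is invented purely for illustration.

store = {}
lock = threading.Lock()

def handle(conn):
    with conn:
        parts = conn.recv(1024).decode().strip().split(" ", 2)
        with lock:
            if parts[0] == "SET" and len(parts) == 3:
                store[parts[1]] = parts[2]
                reply = "OK"
            elif parts[0] == "GET" and len(parts) == 2:
                reply = store.get(parts[1], "")
            else:
                reply = "ERR"
        conn.sendall(reply.encode())

def serve(host="127.0.0.1", port=9099):
    with socket.socket() as srv:
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

def ask(message, host="127.0.0.1", port=9099):
    with socket.socket() as c:
        c.connect((host, port))
        c.sendall(message.encode())
        return c.recv(1024).decode()

if __name__ == "__main__":
    threading.Thread(target=serve, daemon=True).start()
    time.sleep(0.2)                               # give the server a moment to start
    print(ask("SET credit_score frightening"))    # OK
    print(ask("GET credit_score"))                # frightening -- read remotely,
                                                  # changed remotely, left different
```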
4. The Information.
In the 1992 film Sneakers, Robert Redford and Ben Kingsley play aging former hackers, men who were in on the ground floor of those networks that Taylor and Licklider were writing about and inventing. Twenty years on, they have discovered themselves in conflict. In the climactic scene, Kingsley’s villainous character, Cosmo, delivers a thundering bad-guy-speech: “There’s a war out there, old friend. A world war. And it’s not about who’s got the most bullets. It’s about who controls the information. What we see and hear, how we work, what we think — it’s all about the information!”
This film came out about a year before I got my first computer, and though I loved it dearly — I must have seen it seven or eight times in my teenage years — I had no way of conceptualizing what that climactic speech was about. Who controls the information? Information is controllable? Even after I got a computer, it didn’t make a hell of a lot of sense to me. That thing didn’t have any information on it. It had a game called Cannon Fodder. Eventually it had a bunch of half-finished short stories. I wasn’t sure any of that qualified as information.[5] And how could you have a war over any of it?
Years later, as a sophomore in college, I first plugged a computer into a fast internet line, and I understood. At Pomona College in the late 1990s, there was an excess of access to what we then called the Information Superhighway:[6] we had a T3 line, enough bandwidth for a huge company in the business of web hosting, dedicated to 1500 undergraduates with nothing better to do than download music and send IMs. There wasn’t that much to do on the internet in those days: there was no YouTube, no Facebook, and yet there was so much — and without Google, it all seemed undifferentiated, hard to find, difficult to access. How did we hear about new websites in those days, without Reddit and Twitter? I honestly cannot remember.
But still. The deluge: within weeks I had found a baseball chat board, where I learned a great deal — and more than that, probably above and beyond anything that Licklider or Taylor could have imagined back when they were operating in obscurity (if not secret), I made friends and enemies. There were strange outcroppings of internetual rock, dissertations on the subjects of role playing games I had never heard of, fake people telling obvious lies for reasons unknown, libertarians living in socialist Scandinavia, and more pictures of naked women than I ever could have imagined before I plugged in. James Gleick writes in his book The Information that many people considered early telegraphs — done visually, without electricity, but literally with smoke and mirrors — “a nervous system for the earth”. It seems astounding to say so: controlled and funded by Napoleon, the telegraph network was not “an instrument of knowledge or of riches, but … an instrument of power”. Is not the internet, and all its appurtenances, a much better model of a nervous system — an instrument of knowledge and wealth and power and … well, one hesitates to call videos of cats falling into bathtubs any of those things, but I suppose it is information of a variety.
What is a computer for? What does it do? I am not entirely convinced that the modern computer is anywhere near enough like its ancestors to be considered the same thing — the laptop I’m writing on now has little other than a binary system of storing information in common with Alan Turing’s Bombe, which did only math and occupied an entire room. Though Turing was always at pains to assert that the inherent essence of computing was not in the electricity but in the logic, it does seem that on some fundamental level things changed — possibly more than once — in the interim.
Turing’s Bombe and the Z3 and the ENIAC did math. They were programmable — they ran according to a system of rules and gave consistent results[7] — but what they did, in essence, was compute logarithms and probabilities. They had memory, but it lived in punch cards and vacuum tubes, easily lost, fundamentally no better as a system of organizing information than typing it onto a sheet, paper cuts, ink-stained fingers and all.
The computers that provoked anxiety in the congressman from New York, they did math, too — but they were memory machines, able to store and organize information in a way that was fundamentally different to how it was done in a file cabinet: this memory was searchable, accessible, and though it still had to go on disks it no longer involved a paper cut. Perhaps that’s what troubled the congressman from New York: not that someone might someday know his credit score, or those of his constituents; but that it would be stored in an automaton, and easily found — not forgotten. Such a process increases the alienation in the world. It renders human judgement increasingly moot.
The little Macintosh with the Cannon Fodder? One is by now so far removed from a giant, overheating machine designed to break codes that one begins to wonder if we’re really still dealing with the same beast as before. There are some principles in common, but no purposes that I can discern. Once we turn on the network — once the computers are talking to each other, and the people using them talking through them — then, I believe, we have reached a point at which it is not really possible to assert with a straight face that what we have on our hands is really the same thing at all. I suppose it comes down to a philosophical question: what defines a thing, the rules by which it was built, or the use to which it is put? Because my computer does not do math. I use it to manipulate the rich store of living information, and to create more.