random characters on the screen—known to cognoscenti as “going Cyrillic.” But to the MacOS, the screen was not a teletype but a place to put graphics; the image on the screen was a bitmap, a literal rendering of the contents of a particular portion of the computer’s memory. When the computer crashed and wrote gibberish into the bitmap, the result was something that looked vaguely like static on a broken television set—a “snow crash.”
And even after the introduction of Windows, the underlying differences endured; when a Windows machine got into trouble, the old command line interface would fall down over the GUI like an asbestos fire curtain sealing off the proscenium of a burning opera. When a Macintosh got into trouble, it presented you with a cartoon of a bomb, which was funny the first time you saw it.
These were by no means superficial differences. The reversion of Windows to a CLI when it was in distress proved to Mac partisans that Windows was nothing more than a cheap facade, like a garish afghan flung over a rotted-out sofa. They were disturbed and annoyed by the sense that lurking underneath Windows’ ostensibly user-friendly interface was—literally—a subtext.
For their part, Windows fans might have made the sour observation that all computers, even Macintoshes, were built on that same subtext, and that the refusal of Mac owners to admit that fact to themselves seemed to signal a willingness, almost an eagerness, to be duped.
Anyway, a Macintosh had to switch individual bits in the memory chips on the video card, and it had to do it very fast and in arbitrarily complicated patterns. Nowadays this is cheap and easy, but in the technological regime that prevailed in the early 1980s, the only realistic way to do it was to build the motherboard (which contained the CPU) and the video system (which contained the memory that was mapped onto the screen) as a tightly integrated whole—hence the single, hermetically sealed case that made the Macintosh so distinctive.
When Windows came out, it was conspicuous for its ugliness, and its current successors, Windows 95, 98, and Windows NT, are not things that people would pay money to look at either. Microsoft’s complete disregard for aesthetics gave all of us Mac-lovers plenty of opportunities to look down our noses at them. That Windows looked an awful lot like a direct ripoff of MacOS gave us a burning sense of moral outrage to go with it. Among people who really knew and appreciated computers (hackers, in Steven Levy’s nonpejorative sense of that word), and in a few other niches such as professional musicians, graphic artists, and schoolteachers, the Macintosh, for a while, was simply the computer. It was seen as not only a superb piece of engineering, but an embodiment of certain ideals about the use of technology to benefit mankind, while Windows was seen as both a pathetically clumsy imitation and a sinister world domination plot rolled into one. So, very early, a pattern had been established that endures to this day: people dislike Microsoft, which is okay; but they dislike it for reasons that are poorly considered, and in the end, self-defeating.
CLASS STRUGGLE ON THE DESKTOP
Now that the Third Rail has been firmly grasped, it is worth reviewing some basic facts here. Like any other publicly traded, for-profit corporation, Microsoft has, in effect, borrowed a bunch of money from some people (its stockholders) in order to be in the bit business. As an officer of that corporation, Bill Gates has only one responsibility, which is to maximize return on investment. He has done this incredibly well. Any actions taken in the world by Microsoft—any software released by them, for example—are basically epiphenomena, which can’t be interpreted or understood except insofar as they reflect Bill Gates’s execution of his one and only responsibility.
It follows that if Microsoft sells goods that are aesthetically unappealing, or that don’t work very