All My Friends Live In My Phone and So Do I Part III: Two Houses


Image by Marcin Wichary (Creative Commons)

by Michael Malloy

“And it’s ho, boys, can’t you code it and program it right, nothing ever happens in this life of mine, I’m hauling up the data on the Xerox line” – Stan Rogers

The establishment of a new role, the home computer consumer, was made possible only by the proliferation of new form factors and roles for computers. The demands of institutions and business became less central to the producers of home computers. Although computers had been shrinking since before the mid-1970s, that process had not yet produced home machines that were reasonably powerful on their own. The PDP-11 and its descendant, the VAX, developed by the Digital Equipment Corporation, were in tight competition with the IBM System/360 series. They came to define the era of the “minicomputer,” when computers were the size of cabinets, humming comfortably in a room with a few staff, typically printing to teletypewriters and storing data on long spools of magnetic tape encased within dignified grandfather-clock-style reel-to-reel machines. But even these smaller beasts were outstripped by their lighter successors: the microcomputers. These were destined for greatness: small enough to fit on a desk, most could output to a standard television set, and they offered just enough memory and functionality to justify their purchase by a household. The computer joined the landline and the TV as a household appliance.

This was the era when the now-familiar brands of computing emerged: Apple, Microsoft, HP, Intel. But two computers, more than any others, defined it: the Commodore 64 and the IBM PC. Two utterly divergent sets of design principles guided the development of these machines, responding to contradictory demands from this newly minted consumer market.

IBM hardly needs introduction. A longtime giant in the computing industry, they had their fingers in every pie. Typewriters? The IBM Selectric was an industry favorite. Mainframes? IBM’s System/360 dominated the class with a near-monopoly. Government work? IBM ecumenically collaborated with the Nazis on punch card systems for death camps while helping the US government establish Social Security. They were everywhere, they were monolithic, and they were relentless. No more amoral a company existed.

Commodore was the underdog, a newcomer to the computer industry founded by the ferociously savvy Jack Tramiel. Tramiel’s life encompassed a cross-section of the 20th century. Born to a Jewish family in pre-war Poland, he and his family were deported to the Litzmannstadt ghetto in Łódź, then interned at Auschwitz. His father perished in the camp; he survived. He moved to New York, served in the US Army, and worked as a taxi driver before creating his first company, Commodore Portable Typewriter, with a US Army loan. He then set up a shell company in Canada to import components from Czechoslovakia, shipping them on to Australia to take advantage of “preferential tariff conditions.” Later he expanded the business into calculators and ended production of the typewriters, which were struggling to find footing against Japanese imports. Commodore was not the leader in its field by any stretch of the imagination, but a fortuitous chain of events delivered Tramiel to the precipice of a truly historic moment.

After purchasing semiconductor manufacturer MOS Technology, a major supplier of his calculator components, Tramiel took the advice of MOS’s head designer, Chuck Peddle, and made his first foray into computers, building on MOS’s previous success, the KIM-1. The KIM itself was not a particularly useful computer. It had no keyboard, just buttons for keying in hexadecimal values, and a tiny seven-segment display on a bare circuit board. It was merely the frame surrounding MOS’s masterpiece, a processor ahead of its time: the MOS Technology 6502. It was the cheapest in its class, faster than the competition, and flexible enough to suit a wide range of uses, which made it an instant hit. Its introduction rippled across the computing world. Atari, Apple, Nintendo, and even the BBC adopted the chip, and competitors found themselves paying for the privilege of using it. MOS, now under Commodore’s control, held all the cards, and Commodore plotted its next move.

IBM’s and Commodore’s computers hit the market within months of each other. In August 1981, IBM released the IBM Personal Computer, a chunky cuboid that truly embodied the name “business machine.” Priced for professionals seeking a serious computer for serious work, the IBM PC cost about $4,200 (adjusted for inflation) and was a decidedly un-fun machine. Wisely, it cribbed the Apple II’s open architecture for expansions, allowing a large secondary market to emerge that would propel the PC standard and PC-compatible devices to near-total dominance of the market. It sold very well, and its successors, the XT, the AT, and subsequent clones, came to define the desktop market as we know it.

The Commodore 64 was a different beast entirely. While the IBM PC was aimed at the professional class, the C64 was a computer for the masses. Adjusted for inflation, it cost about $1,500, less than a third of the price of its competitors. It was produced from January 1982 to April 1994, with only three minor revisions to its internal design over that period. It sold 17 million units, more than any single home computer model before or since. It outsold the IBM PC for two years. It was an astronomical rise to dominance, with homespun commercials featuring William Shatner and an enormous, growing market of bespoke peripherals and gadgets. In 1986 it even got a Macintosh-style graphical operating system. It was in schools, it was in homes, and, more importantly, its iconic blue power-up screen marked the first time many children and students had ever encountered an actual computer. It reassured you that it was “READY.” as its blinking cursor waited expectantly. As with many home computers of the time, you didn’t get a disk operating system to tool around with. You had to program it, and masses of young computer users did.

This was a significant accomplishment at the time. Until the microcomputer saturated schools and classrooms, programming was not seen as a skill with much relevance outside of a technical environment. There was a push to introduce programming classes into K-8 education, and the future for young programmers looked bright: their skills would carry over to the world of tomorrow, where everyone would need to program, where we would integrate our lives into the world of computers.

This future never really materialized as predicted. Instead, IBM earned a Pyrrhic victory over Commodore and Apple, their PC’s business-ready environment vaulting them over their competitors’ computers, which were increasingly seen as niche and subcultural. The dirty dealings of IBM and their lesser imp Microsoft are, of course, legendary in scope and shamelessness, but even the mighty fall. IBM’s ill-fated OS/2 operating system, the graphical successor to the standard MS-DOS, was betrayed by their then-partner Microsoft, which abandoned the project in pursuit of its own graphical user interface: Microsoft Windows. IBM slowly retreated from the market they had created, leaving a host of “beige boxes” to fight for supremacy in the PC-compatible market.

And so we return to the devices of the modern era. Instead of following this path toward greater comprehension and understanding of the devices in our lives, we have been alienated from them by a variety of means. The dream of the late 1980s and early 1990s, of a world where programming would bring us closer to liberation-through-device, where we could take advantage of this wondrous new technology and develop a society-wide understanding of its operation, died in the crib. It was killed by the same force that comes for all exciting technologies, all breakthroughs and developments and ingenious innovations: normalization under capitalism.

Michael Malloy is a student teacher, ecological researcher, and socialist from California’s Central Valley. He learned to program on a Commodore 64 and made a text-based adventure revolving around an Italian butcher escaping jail.