Tools For Thought by Howard Rheingold
April 2000: a revised edition of Tools for Thought is available from MIT Press, including a revised chapter with 1999 interviews of Doug Engelbart, Bob Taylor, Alan Kay, Brenda Laurel, and Avron Barr. ISBN: 9780262681155 (ISBN-10: 0262681153).
The idea that people could use computers to amplify thought and communication, as tools for intellectual work and social activity, was not an invention of the mainstream computer industry or orthodox computer science, nor even of homebrew computerists; their work was rooted in older, equally eccentric, equally visionary work. You can't really guess where mind-amplifying technology is going unless you understand where it came from.
- HLR

index
Chapter One: The Computer Revolution Hasn't Happened Yet
Chapter Two: The First Programmer Was a Lady
Chapter Three: The First Hacker and his Imaginary Machine
Chapter Four: Johnny Builds Bombs and Johnny Builds Brains
Chapter Five: Ex-Prodigies and Antiaircraft Guns
Chapter Six: Inside Information
Chapter Seven: Machines to Think With
Chapter Eight: Witness to History: The Mascot of Project MAC
Chapter Nine: The Loneliness of a Long-Distance Thinker
Chapter Ten: The New Old Boys from the ARPAnet
Chapter Eleven: The Birth of the Fantasy Amplifier
Chapter Twelve: Brenda and the Future Squad
Chapter Thirteen: Knowledge Engineers and Epistemological Entrepreneurs
Chapter Fourteen: Xanadu, Network Culture, and Beyond
Footnotes
Chapter One:
The Computer Revolution Hasn't Happened Yet

South of San Francisco and north of Silicon Valley, near the place where the pines on the horizon give way to the live oaks and radiotelescopes, an unlikely subculture has been creating a new medium for human thought. When mass-production models of present prototypes reach our homes, offices, and schools, our lives are going to change dramatically.

The first of these mind-amplifying machines will be descendants of the devices now known as personal computers, but they will resemble today's information processing technology no more than a television resembles a fifteenth-century printing press. They aren't available yet, but they will be here soon. Before today's first-graders graduate from high school, hundreds of millions of people around the world will join together to create new kinds of human communities, making use of a tool that a small number of thinkers and tinkerers dreamed into being over the past century.

Nobody knows whether this will turn out to be the best or the worst thing the human race has done for itself, because the outcome of this empowerment will depend in large part on how we react to it and what we choose to do with it. The human mind is not going to be replaced by a machine, at least not in the foreseeable future, but there is little doubt that the worldwide availability of fantasy amplifiers, intellectual toolkits, and interactive electronic communities will change the way people think, learn, and communicate.

It looks as if this latest technology-triggered transformation of society could have even more intense impact than the last time human thought was augmented, five hundred years ago, when the Western world learned to read. Less than a century after the invention of movable type, the literate community in Europe had grown from a privileged minority to a substantial portion of the population. People's lives changed radically and rapidly, not because of printing machinery, but because of what that invention made it possible for people to know. Books were just the vehicles by which the ideas escaped from the private libraries of the elite and circulated among the population.

The true value of books emerged from the community they made possible, an intellectual community that is still alive all over the world. The printed page has been a medium for the propagation of ideas about chemistry and poetry, evolution and revolution, democracy and psychology, technology and industry, and many other notions beyond the ken of the people who invented movable type and started cranking out Bibles.

Because mass production of sophisticated electronic devices can lag ten years or more behind the state of the art in research prototypes, the first effects of the astonishing achievements in computer science since 1960 have only begun to enter our lives. Word processors, video games, educational software, and computer graphics were unknown terms to most people only ten years ago, but today they are the names for billion-dollar industries. And the experts agree that the most startling developments are yet to come.

A few of the pioneers of personal computing who still work in the computer industry can remember the birth and the dream, when the notion of personal computing was an obscure heresy in the ranks of the computing priesthood. Thirty years ago, the overwhelming majority of the people who designed, manufactured, programmed, and used computers subscribed to a single idea about the proper (and possible) place of computers in society: "computers are mysterious devices meant to be used in mathematical calculations." Period. Computer technology was believed to be too fragile, valuable, and complicated for nonspecialists.

In 1950 you could count the people who took exception to this dogma on the fingers of one hand. The dissenting point of view shared by those few people involved a different way of thinking about how computers might be used. The dissenters shared a vision of personal computing in which computers would be used to enhance the most creative aspects of human intelligence — for everybody, not just the technocognoscenti.

Those who questioned the dogma of data processing agreed that computers can help us calculate, but they also suspected that if the devices could be made more interactive, these tools might help us to speculate, build and study models, choose between alternatives, and search for meaningful patterns in collections of information. They wondered whether this newborn device might become a communication medium as well as a calculating machine.

These heretical computer theorists proposed that if human knowledge is indeed power, then a device that can help us transform information into knowledge should be the basis for a very powerful technology. While most scientists and engineers remained in awe of the giant adding machines, this minority insisted on thinking about how computers might be used to assist the operation of human minds in nonmathematical ways.

Tools for Thought focuses on the ideas of a few of the people who have been instrumental in creating yesterday's, today's, and tomorrow's human-computer technology. Several key figures in the history of computation lived and died centuries or decades ago. I call these people, renowned in scientific circles but less known to the public, the patriarchs. Other co-creators of personal computer technology are still at work today, continuing to explore the frontiers of mind-machine interaction. I call them the pioneers.

The youngest generation, the ones who are exploring the cognitive domains we will all soon experience, I call the Infonauts. It is too early to tell what history will think of the newer ideas, but we're going to take a look at some of the things the latest inner-space explorers are thinking, in hopes of catching some clues to what (and how) everybody will be thinking in the near future.

As we shall see, the future limits of this technology are not in the hardware but in our minds. The digital computer is based upon a theoretical discovery known as "the universal machine," which is not actually a tangible device but a mathematical description of a machine capable of simulating the actions of any other machine. Once you have created a general-purpose machine that can imitate any other machine, the future development of the tool depends only on what tasks you can think to do with it. For the immediate future, the issue of whether machines can become intelligent is less important than learning to deal with a device that can become whatever we clearly imagine it to be.

The pivotal difference between today's personal computers and tomorrow's intelligent devices will have less to do with their hardware than their software — the instructions people create to control the operations of the computing machinery. A program is what tells the general-purpose machine to imitate a specific kind of machine. Just as the hardware basis for computing has evolved from relays to vacuum tubes to transistors to integrated circuits, the programs have evolved as well. When information processing grows into knowledge processing, the true personal computer will reach beyond hardware and connect with a vaster source of power than that of electronic microcircuitry — the power of human minds working in concert.
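
The claim that a program tells the general-purpose machine which specific machine to imitate is easier to see in miniature. Here is a minimal sketch in Python; the instruction set, the register names, and the sample "multiplier" table are invented purely for illustration and come from no particular historical machine. The interpreter itself never changes; the table of instructions handed to it is what makes it behave like one machine rather than another.

    # A tiny general-purpose interpreter: the "hardware" stays fixed,
    # and the program it is fed determines which machine it imitates.
    def run(program, registers):
        """Execute a list of (op, *args) instructions on a dict of registers."""
        pc = 0                                  # program counter
        while pc < len(program):
            op, *args = program[pc]
            if op == "add":                     # add register b into register a
                a, b = args
                registers[a] += registers[b]
                pc += 1
            elif op == "dec":                   # subtract 1 from register a
                registers[args[0]] -= 1
                pc += 1
            elif op == "jump_if_zero":          # go to instruction i if register a holds 0
                a, i = args
                pc = i if registers[a] == 0 else pc + 1
            elif op == "halt":
                break
        return registers

    # Handed this table, the same interpreter "becomes" a multiplier:
    # it adds x into total, y times over.
    multiplier = [
        ("jump_if_zero", "y", 4),     # 0: finished when y reaches 0
        ("add", "total", "x"),        # 1: total = total + x
        ("dec", "y"),                 # 2: y = y - 1
        ("jump_if_zero", "zero", 0),  # 3: unconditional jump back ("zero" always holds 0)
        ("halt",),                    # 4: stop
    ]

    print(run(multiplier, {"x": 6, "y": 7, "total": 0, "zero": 0}))
    # -> {'x': 6, 'y': 0, 'total': 42, 'zero': 0}

Swap in a different table and the same interpreter becomes an adder, a counter, or anything else its few instructions can express, which is all the idea of a universal machine asks of it.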

The nature of the world we create in the closing years of the twentieth century will be determined to a significant degree by our attitudes toward this new category of tool. Many of us who were educated in the pre-computer era shall be learning new skills. The college class of 1999 is already on its way. It is important that we realize today that those skills of tomorrow will have little to do with how to operate computers and a great deal to do with how to use augmented intellects, enhanced communications, and amplified imaginations.

Forget about "computer literacy" or obfuscating technical jargon, for these aberrations will disappear when the machines and their programs grow more intelligent. The reason for building a personal computer in the first place was to enable people to do what people do best by using machines to do what machines do best. Many people are afraid of today's computers because they have been told that these machines are smarter than they are — a deception that is reinforced by the rituals that novices have been forced to undergo in order to use computers. In fact, the burden of communication should be on the machine. A computer that is difficult to use is a computer that's too dumb to understand what you want.

If the predictions of some of the people in this book continue to be accurate, our whole environment will suddenly take on a kind of intelligence of its own sometime between now and the turn of the century. Fifteen years from now, there will be a microchip in your telephone receiver with more computing power than all the technology the Defense Department can buy today. All the written knowledge in the world will be one of the items to be found in every schoolchild's pocket.

The computer of the twenty-first century will be everywhere, for better or for worse, and a more appropriate prophet than Orwell for this eventuality might well be Marshall McLuhan. If McLuhan was right about the medium being the message, what will it mean when the entire environment becomes the medium? If such a development does occur as predicted, it will probably turn out differently from even the wildest "computerized household" scenarios of the recent past.

The possibility of accurately predicting the social impact of any new technology is questionable, to say the least. At the beginning of the twentieth century, it was impossible for average people or even the most knowledgeable scientists to envision what life would be like for their grandchildren, who we now know would sit down in front of little boxes and watch events happening at that moment on the other side of the world.

Today, only a few people are thinking seriously about what to do with a living room wall that can tell you anything you want to know, simulate anything you want to see, connect you with any person or group of people you want to communicate with, and even help you find out what it is when you aren't entirely sure. In the 1990s it might be possible for people to "think as no human being has ever thought" and for computers to "process data in a way not approached by the information-handling machines we know today," as J.C.R. Licklider, one of the most influential pioneers, predicted in 1960, a quarter of a century before the hardware would begin to catch up with his ideas.

The earliest predictions about the impact of computing machinery occurred quite a bit earlier than 1960. The first electronic computers were invented by a few individuals, who often worked alone, during World War II. Before the actual inventors of the 1940s were the software patriarchs of the 1840s. And before them, thousands of years ago, the efforts of thinkers from many different cultures to find better ways to use symbols as tools led to the invention of mathematics and logic. It was these formal systems for manipulating symbols that eventually led to computation. Links in what we can now see as a continuous chain of thought were created by a series of Greek philosophers, British logicians, Hungarian mathematicians, and American inventors.

Most of the patriarchs had little in common with each other, socially or intellectually, but in some ways they were very much alike. It isn't surprising that they were exceptionally intelligent, but what is unusual is that they all seem to have been preoccupied with the power of their own minds. For sheer intellectual adventure, many intelligent people pursue the secrets of the stars, the mysteries of life, the myriad ways to use knowledge to accomplish practical goals. But what the software ancestors sought to create were tools to amplify the power of their own brains — machines to take over what they saw as the more mechanical aspects of thought.

Perhaps as an occupational hazard of this dangerously self-reflective enterprise, or as a result of being extraordinary people in restrictive social environments, the personalities of these patriarchs (and matriarchs) of computation reveal a common streak of eccentricity, ranging from the mildly unorthodox to the downright strange.

  • Charles Babbage and Ada, Countess of Lovelace, lived in the London of Dickens and Prince Albert (and knew them both). A hundred years before some of the best minds in the world used the resources of a nation to build a digital computer, these two eccentric inventor-mathematicians dreamed of building their "Analytical Engine." He constructed a partial prototype and she used it, with notorious lack of success, in a scheme to win a fortune at the horse races.

    Despite their apparent failures, Babbage was the first true computer designer, and Ada was history's first programmer.

  • George Boole invented a mathematical tool for future computer-builders — an "algebra of logic" that was used nearly a hundred years later to link the process of human reason to the operations of machines. The idea came to him in a flash of inspiration when he was walking across a meadow one day, at the age of seventeen, but it took him twenty years to teach himself enough mathematics to write The Laws of Thought.

    Although Boole's lifework was to translate his inspiration into an algebraic system, he continued to be so impressed with the suddenness and force of the revelation that hit him that day in the meadow that he also wrote extensively about the powers of the unconscious mind. After his death Boole's widow turned these ideas into a kind of human potential cult, a hundred years before the "me decade."

  • Alan Turing solved one of the most crucial mathematical problems of the modern era at the age of twenty-four, creating the theoretical basis for computation in the process. Then he became the top code-breaker in the world — when he wasn't bicycling around wearing a gas mask or running twenty miles with an alarm clock tied around his waist. If it hadn't been for the success of Turing's top-secret wartime mission, the Allies might have lost World War II. After the war, he created the field of artificial intelligence and laid down the foundations of the art and science of programming.

    He was notoriously disheveled, socially withdrawn, sometimes loud and abrasive, and even his friends thought that he carried nonconformity to weird extremes. At the age of forty-one, he committed suicide, hounded cruelly by the same government he had helped save.

  • John von Neumann spoke five languages and knew dirty limericks in all of them. His colleagues, famous thinkers in their own right, all agreed that the operations of Johnny's mind were too deep and far too fast to be entirely human. He was one of history's most brilliant physicists, logicians, and mathematicians, as well as the software genius who invented the first electronic digital computer.

    John von Neumann was the center of the group who created the "stored program" concept that made truly powerful computers possible, and he specified a template that is still used to design almost all computers — the "von Neumann architecture." When he died, the Secretaries of Defense, the Army, Air Force, and Navy, and the Joint Chiefs of Staff were all gathered around his bed, attentive to his last gasps of technical and policy advice.

  • Norbert Wiener, raised to be a prodigy, graduated from Tufts at fourteen, earned his Ph.D. from Harvard at eighteen, and studied with Bertrand Russell at nineteen. Wiener had a different kind of personality than his contemporary and colleague, von Neumann. Although involved in the early years of computers, he eventually refused to take part in research that could lead to the construction of weapons. Scarcely less brilliant than von Neumann, Wiener was vain, sometimes paranoid, and not known to be the life of the party, but he made important connections between computers, living organisms, and the fundamental laws of the physical universe. He guarded his ideas and feuded with other scientists, writing unpublished novels about mathematicians who did him wrong.

    Wiener's conception of cybernetics was partially derived from "pure" scientific work in mathematics, biology, and neurophysiology, and partially derived from the grimly applied science of designing automatic antiaircraft guns. Cybernetics was about the nature of control and communication systems in animals, humans, and machines.

  • Claude Shannon, another lone-wolf genius, is still known to his neighbors in Cambridge, Massachusetts, for his skill at riding a motorcycle. In 1937, as a twenty-one-year-old graduate student, he showed that Boole's logical algebra was the perfect tool for analyzing the complex networks of switching circuits used in telephone systems and, later, in computers. During the war and afterward, Shannon established the mathematical foundation of information theory. Together with cybernetics, this collection of theorems about information and communication created a new way to understand people and machines — and established information as a cosmic fundamental, along with energy and matter.
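
The germ of that insight fits in a few lines. Here is a minimal sketch in Python (the circuit and all the names are invented for illustration, not taken from Shannon's thesis): switches wired in series behave like the logical AND, switches wired in parallel behave like OR, so a whole network of switches can be written down, simplified, and checked as a Boolean expression.

    # Series connection: current flows only if both switches are closed (AND).
    def series(a, b):
        return a and b

    # Parallel connection: current flows if either switch is closed (OR).
    def parallel(a, b):
        return a or b

    # A two-way hallway light, written as a Boolean expression:
    # two series pairs wired in parallel. Flipping either switch toggles the lamp.
    def hallway_light(up, down):
        return parallel(series(up, down), series(not up, not down))

    for up in (False, True):
        for down in (False, True):
            print(up, down, "->", "on" if hallway_light(up, down) else "off")

Real telephone networks involved thousands of such relays rather than two switches, but the same algebra scaled with them.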

The software patriarchs came from wildly different backgrounds. Then as now, computer geniuses were often regarded as "odd" by those around them, and their reasons for wanting to invent computing devices seem to have been as varied as their personalities. Something about the notion of a universal machine enticed mathematicians and philosophers, logicians and code-breakers, whiz kids and bomb-builders. Even today, the worlds of computer research and the software business bring together an unlikely mixture of entrepreneurs and evangelists, futurians and utopians, cultists, obsessives, geniuses, pranksters, and fast-buck artists.

Despite their outward diversity, the computer patriarchs of a hundred years ago and the cyberneticians of the World War II era appear to have shared at least one characteristic with each other and with the software pioneers and infonauts of more recent vintage. In recent years, the public has become more aware of a subculture that sprouted in Cambridge and Palo Alto and has quietly spread, over the past two decades, through a national network of fluorescent-lit campus computer centers — the mostly young, mostly male, often brilliant, sometimes bizarre "hackers," or self-confessed compulsive programmers. Sociologists and psychologists of the 1980s are only beginning to speculate about the deeper motivation for this obsession, but any latter-day hacker will admit that the most fascinating thing in his own life is his own mind, and tell you that he regards intense, prolonged interaction with a computer program as a particularly satisfying kind of dialogue with his own thoughts.

A little touch of the hacker mentality seems to have affected all of the major players in this story. From what we know today about the patriarchs and pioneers, they all appear to have pursued a vision of a new way to use their minds. Each of them was trying to create a mental lever. Each of them contributed indispensable components of the device that was eventually assembled. But none of them encompassed it all.

The history of computation became increasingly complex as it progressed from the patriarchs to the pioneers. At the beginning, many of the earliest computer scientists didn't know that their ideas would end up in a kind of machine, and almost all of them worked in isolation. Because of that isolation from one another, the common intellectual ancestors of the modern computer are relatively easy to discern in retrospect. Since the 1950s, however, with the proliferation of researchers and teams of researchers in academic, industrial, and military institutions, the branches of the history have become tangled and too numerous to describe exhaustively, and it has become increasingly difficult to assign credit for computer breakthroughs to individual inventors.

Although individual contributors to the past two or three decades of computer research and development have been abundant, the people who have been able to see some kind of overall direction in the fast, fragmented progress of recent years have been scarce. Just as the earliest logicians and mathematicians didn't know their thoughts would end up as part of a machine, the vast majority of the engineers and programmers of the 1960s were unaware that their machines had anything to do with human thought. The latter-day computer pioneers profiled in the middle chapters of this book were among the few who played central roles in the development of personal computing. Like their predecessors, these people tried to create a kind of mental lever. Unlike most of their predecessors, they were also trying to design a tool that the entire population might use.

Where the original software patriarchs solved various problems in the creation of the first computers, the personal computer pioneers struggled with equally vexing problems involved in using computers to create leverage for human intellect, the way wheels and dynamos create leverage for human muscles. Where the patriarchs were out to create computation, the pioneers sought to transform it:

  • J.C.R. Licklider, an experimental psychologist at MIT who became the director of the Information Processing Techniques Office of the U.S. Defense Department's Advanced Research Projects Agency (ARPA), was the one man whose vision enabled hundreds of other like-minded computer designers to pursue a whole new direction in hardware and software development. In the early 1960s, the researchers funded by Licklider's programs reconstructed computer science on a new and higher level, through an approach known as time-sharing.

    Although their sponsorship was military, the people Licklider hired or supported were working toward a transformation that he and they believed to be social as well as technological. Licklider saw the new breed of interactive computers his project directors were creating as the first step toward an entirely new kind of human communication capability.

  • Doug Engelbart started thinking about building a thought-amplifying device back when Harry Truman was President, and he has spent the last thirty years stubbornly pursuing his original vision of building a system for augmenting human intellect. At one point in the late 1960s, Engelbart and his crew of infonauts demonstrated to the assembled cream of computer scientists and engineers how the devices most people then used for performing calculations or keeping track of statistics could be used to enhance the most creative human activities.

    His former students have gone on to form a disproportionate part of the upper echelons of today's personal computer designers. Partially because of the myopia of his contemporaries, and partially because of his almost obsessive insistence on maintaining the purity of his original vision, most of Engelbart's innovations have yet to be adopted by the computer orthodoxy.

  • Robert Taylor, at the age of thirty-three, became the director of the ARPA office created by Licklider, thus launching his career in a new and much-needed field — the shaping of large-scale, long-term human-computer research campaigns. He became a "people collector," looking for those computer researchers whose ideas might have been ignored by the orthodoxy, but whose projects promised to boost the state of computer systems by orders of magnitude.

  • Alan Kay was one of television's original quiz kids. He learned to read at the age of two and a half, barely managed to avoid being thrown out of school and the Air Force, and ended up as a graduate student at one of the most important centers of ARPA research. In the 1970s, Kay was one of the guiding software spirits of the Alto project (the first true personal computer) at Xerox's Palo Alto Research Center (PARC) and the chief architect of Smalltalk, a new kind of computer language. He started the 1980s as a director of Atari Corporation's long-term research effort, and in 1984 he left Atari to become a "research fellow" for Apple Computer.

    Along with his hard-won credentials as a rare original thinker able to implement his thoughts via the craft of software design, Kay also has a reputation as a lifelong insubordinate. Since the first time he was thrown out of a classroom for knowing more than the teacher, Kay's avowed goal has been to build a "fantasy amplifier" that anyone with an imagination could use to explore the world of knowledge on their own, a "dynamic medium for creative thought" that could be as useful and thought-provoking to children in kindergarten as it would be to scientists in a research laboratory.

Licklider, Engelbart, Taylor, and Kay are still at work, confident that many more of us will experience the same thrill that has kept them going all these years — what Licklider, still at MIT, calls the "religious conversion" to interactive computing. Engelbart works for Tymshare Corporation, marketing his "Augment" system to information workers. Taylor is setting up another computer systems research center, this time under the auspices of the Digital Equipment Corporation, and is collecting people once again, this time for a research effort that will bring computing into the twenty-first century. Kay, while at Atari, continued to steer toward the fantasy amplifier, even though the parent company was often described in the news media as "seriously troubled." It is fair to assume that he will continue to work toward the same goal in his new association with Steve Jobs, chairman of Apple and a computer visionary of a more entrepreneurial bent.

The pioneers, although they are still at work, are not the final characters in the story of the computer quest. The next generations of innovators are already at work, and some of them are surprisingly young. Computer trailblazers in the past tended to make their marks early in life — a trend that seems to be continuing in the present. Kay, the former quiz kid, is now in his early forties. Taylor is in his early fifties, Engelbart in his late fifties, and Licklider in his sixties. Today, younger men and, increasingly, younger women, have begun to take over the field professionally, while even younger generations are now living in their own versions of the future for fun, profit, and thrills.

The ones I call the "infonauts" are the older brothers and sisters of the adolescent hackers you read about in the papers. Most of them are in their twenties and thirties. They work for themselves or for some research institution or software house, and represent the first members of the McLuhan generation to use the technology invented by the von Neumann generation as a tool to extend their imaginations. From the science of designing what they call the "user interface" — where mind meets machine — to the art of building educational microworlds, the infonauts have been using their new medium to create the mass-media version of it that we will all use fifteen years from now.

  • Avron Barr is a knowledge engineer who helps build the special computer programs known as expert systems that are apparently able to acquire knowledge from human experts and transfer it to other humans. These systems are now used experimentally to help physicians diagnose diseases, as well as commercially to help geologists locate mineral deposits and to aid chemists in identifying new compounds.

    Although philosophers debate whether such programs truly "understand" what they are doing, and psychologists point out the huge gap between the narrowly defined kind of expertise involved in geology or diagnosis and the much more general "world knowledge" that all humans have, there is no denying that expert systems are valuable commodities. Avron Barr believes that they will evolve into more than expensive encyclopedias for specialists. In his mid-thirties and just starting his career in an infant technology, he dreams of creating an expert assistant in the art of helping people agree with one another.

  • Brenda Laurel, also in her mid-thirties, is an artist whose medium exists at the boundary of Kay's and Barr's and Engelbart's specialties. Her goal is to design new methods of play, learning, and artistic expression into computer-based technologies. Like Barr, she believes that the applications of her research point toward more extensive social effects than just another success in the software market.

    Brenda wants to use an expert system that knows what playwrights, composers, librarians, animators, artists, and dramatic critics know, to create a world of sights and sounds in which people can learn about flying a spaceship or surviving in the desert or being a blue whale by experiencing space-desert-whale simulated microworlds in person.

  • Ted Nelson is a dropout, gadfly, and self-proclaimed genius who self-published Computer Lib, the best-selling underground manifesto of the microcomputer revolution. His dream of a new kind of publishing medium and continuously updated world-library threatens to become the world's longest software project. He's wild and woolly, imaginative and hyperactive, has problems holding jobs and getting along with colleagues, and was the secret inspiration to all those sub-teenage kids who lashed together homebrew computers or homemade programs a few years back and are now the ruling moguls of the microcomputer industry.

    Time will tell whether he is a prophet too far ahead of his time, or just a persistent crackpot, but there is no doubt that he has contributed a rare touch of humor to the often too-serious world of computing. How can you not love somebody who says "they should have called it an oogabooga box instead of a computer"?

Despite their differences in background and personality, the computer patriarchs, software pioneers, and the newest breed of infonauts seem to share a distant focus on a future that they are certain the rest of us will see as clearly as they do — as soon as they turn what they see in their mind's eye into something we can hold in our hands. What did they see? What will happen when their visions materialize in our homes? And what do contemporary visionaries see in store for us next?

© 1985 Howard Rheingold, all rights reserved worldwide.