In the June/July issue of The Atlantic, Nicholas Carr poses a disturbing question: “Is Google Making Us Stupid?” More specifically, Carr wonders whether the modern tendency to consume information online, through a constant stream of headlines, e-mails, and blog posts, has eroded our capacity for deep, measured thought.
To illustrate the point Carr describes his own recent struggles with reading:
Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
It sounds like the realization of a dystopian prophecy: the Internet, the product of decades’ worth of humanity’s genius and innovation, has now turned against the human mind. How did we get here? Several Atlantic articles over the past decades have chronicled our march towards the information age, and each provides keen insight into the evolving relationship between humanity and technology.
In July 1945, The Atlantic published an essay by Vannevar Bush entitled “As We May Think.” In many important ways, this seminal article laid the theoretical groundwork for the information revolution. Bush, who directed the Office of Scientific Research and Development during WWII, realized that the rapidly growing body of human knowledge would be of limited value to future generations without more efficient means of accessing it:
Thus far we seem to be worse off than before—for we can enormously extend the record; yet even in its present bulk we can hardly consult it. This is a much larger matter than merely the extraction of data for the purposes of scientific research; it involves the entire process by which man profits by his inheritance of acquired knowledge.
Bush believed that resolving this dilemma should be the principal task of scientists in the post-War era. He envisioned a system for searching the realm of human knowledge through “selection by association”—a mechanized process that would seek to emulate the way the human mind thinks.
In an especially prescient passage, Bush imagined a machine of the future called a “memex” that would employ the principle of selection by association:
A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility…
It affords an immediate step to associative indexing, the basic idea of which is a provision whereby any item may be caused at will to select immediately and automatically another. This is the essential feature of the memex. The process of tying two items together is the important thing.
Bush proceeded to describe how a memex user could, for example, call up detailed information on the history of the bow and arrow—and even record his own thoughts on the subject. To the modern reader, Bush’s memex sounds eerily familiar. Indeed, Bush even foresaw the birth of Wikipedia when he speculated that “wholly new forms of encyclopedias will appear…ready to be dropped into the memex and there amplified.”
Two decades later, in a May 1964 article, “The Computers of Tomorrow,” MIT professor Martin Greenberger, a pioneer in the new field of computer science, paid tribute to “the remarkable clarity of Dr. Bush’s vision” and then went on to make his own startlingly accurate predictions. At the time of his article, the computer inhabited the limited worlds of scientific research and large industry, but Greenberger foresaw its entrance into everyday life:
General economic and political conditions permitting, this work will nourish a new wave of computer expansion. Computing services and establishments will begin to spread throughout every sector of American life, reaching into homes, offices, classrooms, laboratories, factories, and businesses of all kinds.
Like Bush’s, Greenberger’s crucial insight involved the application of new technology to the realm of information. He conceived of the computer fundamentally as an “information utility” that would allow “an increasing percentage of the day-to-day functioning of man, the economy, and society [to] become documented and mechanically recorded in easily accessible form.”
The advent of the personal computer some years later finally allowed the layperson to engage directly with the very technology Bush and Greenberger believed would revolutionize our lives. James Fallows’ 1982 Atlantic article “Living with a Computer” offered a glimpse into the potential wonders—and pitfalls—of the early PC. Reared on the typewriter, Fallows marveled at the speed of his new computer’s word processor:
When I sit down to write a letter or start the first draft of an article, I simply type on the keyboard and the words appear on the screen…It is faster to type this way than with a normal typewriter, because you don’t need to stop at the end of the line for a carriage return…and you never come to the end of the page, because the material on the screen keeps sliding up to make room for each new line.
While clearly smitten with his new device, the author also recognized the hazards of relying on it too heavily. For one thing, Fallows warned, “Computers forever distort your sense of time.” When occasional delays did occur, he found that “ten minutes was intolerable when everything else happened in a flash.”