After coordinating scientific research for the United States during World War II, including initiating the Manhattan Project, the engineer Vannevar Bush set his sights on a pacifist instrument for world knowledge.
In the July 1945 issue of The Atlantic, Bush outlined his vision for a head-mounted camera attached to “a pair of ordinary glasses” that would record comments, photographs, and data from scientific experiments: “One can now picture a future investigator in his laboratory. His hands are free, and he is not anchored.” His “camera … of the future,” no “larger than a walnut,” worn “where it is out of the way of ordinary vision,” was in many ways a forerunner of today’s augmented-reality devices.
For decades we’ve been inching toward popular augmented-reality technologies that enhance the physical world, each new iteration promising to turn the entire world into a computing interface. Only in the past couple of years, though, have headsets stopped being enormous, bulky, and expensive, and have superimposed images advanced beyond thin lines.
* * *
Coined in the 1990s, the term “augmented reality” describes any technology that overlays digital interfaces onto the physical world. Unlike virtual reality, which immerses you in a simulation using stereoscopic 3D on a screen in front of your eyes, augmented-reality technologies embed opaque holograms directly into the environment. As early as 1990, assembly workers at Boeing wore see-through head-mounted displays that superimposed computerized images showing where to place wires on the 777 aircraft, sparing them from looking back and forth at their manuals. Thad Starner, now a professor of computing at the Georgia Institute of Technology, began wearing homemade computing devices with searchable text and recording functions in 1993, much like Bush imagined, and long before the tech was fashionable or feasibly compact.