Here we are in an era in which prevailing cultural attitudes toward technology are deeply at odds with how that technology actually behaves. While people regard computing systems with a mix of reverence and fear, Arbesman writes, they fail to appreciate that technology’s messy imperfections are both inevitable and, to some extent, comprehensible.
At the same time, we’re being forced to confront a kind of radical novelty in technology, a seemingly inexorable push toward complexity that the computer scientist Edsger Dijkstra once described as “conceptual hierarchies that are much deeper than a single mind ever needed to face before.” That was in 1988. Nearly three decades later, the technological world is far more intricate still. As a result, almost everything humans do in the technological realm, Arbesman writes, “seems to lead us away from elegance and understandability, and toward impenetrable complexity and unexpectedness.” We’re living, he says, in an age of Entanglement.
“Most people think about understanding as a binary condition,” Arbesman told me in an interview. “Either you understand things completely or not at all.” That viewpoint is dangerous when it’s applied to technology today, because there’s simply no way to understand everything. (Or, as Arbesman puts it in his book: “The vast majority of computer programs will never be thoroughly comprehended by any human being.”) Instead, he argues, people should act as technological naturalists, approaching complex digital systems the way a biologist approaches living ones. Doing so will require people to rethink what it means to understand technology, and at what scale:
When attempting to understand a complex system, we must determine the proper resolution, or level of detail, at which to look at it. How fine-grained a level of detail are we focusing on? Do we focus on the individual enzyme molecules in a cell of a large organism, or do we focus on the organs and blood vessels? Do we focus on the binary signals winging their way through circuitry, or do we examine the overall shape and function of a computer program? At a larger scale, do we look at the general properties of a computer network, and ignore the individual machines and decisions that make up this structure?
None of these questions has a straightforward answer, and yet the tendency not to pose them at all, Arbesman cautions, has left humans in a perilous and vulnerable place.
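One way to make the question of resolution concrete (my own illustration, not an example from the book) is Python’s standard dis module, which lets you look at the same small program at two levels of detail: its overall shape and behavior, and the individual bytecode instructions the interpreter works through underneath.

```python
import dis

def average(values):
    """A small program we can examine at more than one resolution."""
    return sum(values) / len(values)

# Coarse resolution: the overall shape and function of the program.
print(average([2, 4, 6]))   # -> 4.0

# Finer resolution: the individual bytecode instructions the interpreter
# executes, a step closer to the signals moving through the circuitry.
dis.dis(average)
```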
Abstraction in computing, including the elegant interfaces that make MacBooks and iPhones so user-friendly, has made machines delightful and easy to use, but it has also opened a gap in comprehension that didn’t exist in the early days of personal computing. (In the beginning, if you wanted to mess around with a computer, you had to learn to speak its language, not the other way around.) One of the best illustrations of this change is the progress bar, which Arbesman cites in his book as a “small interface innovation” designed to soothe computer users with what seems like a small window into the opaque process of updating software. “Think back to the last time you installed a new piece of software,” Arbesman writes. “Did you know what was going on? Did you clearly understand where various packages were being placed in the vast hierarchy of folders on your hard drive, and what bits of information were being modified based on the specific nature of your computer and its operating system? Unlikely.”
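To see how little a progress bar actually reveals, here is a minimal sketch in Python. It assumes nothing about any real installer; the step names are hypothetical stand-ins, and the only thing the user ever sees is a percentage ticking upward.

```python
import sys
import time

# Hypothetical installer steps: stand-ins for whatever real, opaque work
# an installer performs out of sight.
STEPS = [
    "unpacking archive",
    "copying files into place",
    "updating configuration",
    "registering components",
]

def install():
    total = len(STEPS)
    for done, _step in enumerate(STEPS, start=1):
        time.sleep(0.5)                      # stand-in for the real work
        percent = 100 * done // total
        bar = ("#" * (percent // 10)).ljust(10)
        # The step's name is never shown; the user gets only reassurance.
        sys.stdout.write(f"\rInstalling... [{bar}] {percent:3d}%")
        sys.stdout.flush()
    print("\nDone.")

if __name__ == "__main__":
    install()
```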