Hundreds of thousands of air travelers were delayed by a major, system-wide network outage at Delta on Monday morning, a problem that’s becoming increasingly common in a world run by interconnected and aging computer systems.

Even as flights resumed, Delta couldn’t immediately say what caused the outage. The company may never fully know. When a similar blackout grounded United Airlines flights last summer, the airline eventually pinpointed the cause as a router issue that degraded network connectivity for various key applications. But understanding and correcting the cause of one kind of outage doesn’t protect against another. Just one month before the router issue, United had to halt flights due to “automation issues.”

Such large-scale technological failures aren’t just massively inconvenient; they’re potentially dangerous, especially as machines increasingly handle crucial operations across a variety of industries. Complex systems are redefining the ways in which humans think about and interact with technology, a dramatic shift in perspective that poses its own risks. That’s the argument at the heart of Samuel Arbesman’s new book, Overcomplicated.

“When the world we have created is too complicated for our humble human brains, the nightmare scenario is not Skynet—the self-aware network declaring war on humanity—but messy systems so convoluted that nearly any glitch you can think of (and many you can’t) can and will happen,” writes Arbesman. “Complexity brings the unexpected, but we realize it only when something goes wrong.”

Here we are in an era in which prevailing cultural attitudes toward technology are deeply at odds with how that technology actually behaves. While people marvel or sigh at computing systems with a mix of reverence and fear, Arbesman writes, they fail to appreciate that technology’s messy imperfections are both inevitable and, to some extent, comprehensible.

At the same time, we’re being forced to confront a kind of radical novelty in technology, a seemingly inexorable push toward complexity that the computer scientist Edsger Dijkstra once described as “conceptual hierarchies that are much deeper than a single mind ever needed to face before.” That was in 1988. Nearly three decades later, the technological world is far more intricate still. As a result, almost everything humans do in the technological realm, Arbesman writes, “seems to lead us away from elegance and understandability, and toward impenetrable complexity and unexpectedness.” We’re living, he says, in an age of Entanglement.

“Most people think about understanding as a binary condition,” Arbesman told me in an interview. “Either you understand things completely or not at all.” That viewpoint is dangerous when it’s applied to technology today, because there’s simply no way to understand everything. (Or, as Arbesman puts it in his book: “The vast majority of computer programs will never be thoroughly comprehended by any human being.”) Instead, he argues, people should be acting as technological naturalists, approaching complex digital systems the way a biologist would examine living systems. Doing so will require people to rethink what it means to understand technology, and at what scale:

When attempting to understand a complex system, we must determine the proper resolution, or level of detail, at which to look at it. How fine-grained a level of detail are we focusing on? Do we focus on the individual enzyme molecules in a cell of a large organism, or do we focus on the organs and blood vessels? Do we focus on the binary signals winging their way through circuitry, or do we examine the overall shape and function of a computer program? At a larger scale, do we look at the general properties of a computer network, and ignore the individual machines and decisions that make up this structure?

None of these questions has a straightforward answer, and yet the tendency not to pose them at all, Arbesman cautions, has left humans in a perilous and vulnerable place.

Abstraction in computing—and the elegance of interfaces like the ones that make MacBooks and iPhones so user friendly, for instance—has made machines delightful and easy to use, but has also created a huge gap in comprehension that didn’t exist in the early days of personal computing. (In the beginning, if you wanted to mess around with a computer, you had to learn to speak its language; not the other way around.) One of the best illustrations of this change is embodied by the progress bar, an example Arbesman gives in his book as a “small interface innovation” designed to soothe computer users with what seems like a small window into the opaque process of updating software. “Think back to the last time you installed a new piece of software,” Arbesman writes. “Did you know what was going on? Did you clearly understand where various packages were being placed in the vast hierarchy of folders on your hard drive, and what bits of information were being modified based on the specific nature of your computer and its operating system? Unlikely.”

Not only are users shielded from complexity, but systems themselves are orders of magnitude more complex than their predecessors. As a result, there’s an overwhelming cultural tendency to outsource technological expertise—to assume that someone out there understands the complexities of machine systems—so that you don’t have to.

“But there are many situations where no one understands these things,” Arbesman told me. “We can no longer feel that we can just pass off understanding to someone else. It’s incumbent on us to have a better understanding of these systems.”

Arbesman is not saying you need to dismantle your iPhone and build it from scratch, or only use apps that you created yourself. (Although, hey, if that’s your thing, great.) But he is saying that active curiosity—and a certain degree of futzing with the technological systems we encounter—is culturally overdue. “The need to have a more calm, tinkering approach to technologies is going to be very, very important,” he said. “I think also the idea that we need to build in the understanding from the outside that our systems are going to be buggy.”

“Especially now, if people realize more explicitly that we’re in this new age of incomprehensibility,” he added. “That abdication of responsibility is too easy, to say, ‘Oh, I don’t know what I’m doing. Everything is magical and I don’t understand it.’”

There’s also nothing magical, by the way, about the approach that Arbesman is advocating. On the individual level, cultivating a more active curiosity about one’s interactions with technology isn’t going to prevent the next massive airline systems outage. But maybe, in concert with an emphasis on making and collaborating and bug reporting and embracing other values of the open web, individuals can help reorient the cultural attitude toward technology away from entanglement and back to a place of enlightenment.