It turns out driving directly toward huge, looming storm clouds is a great rhetorical device to employ on a road trip to see cloud infrastructure—and also a great way to be faced with the cruel truth of your own mortality.

Somewhere in Arizona, we pulled off the interstate when the combination of hydroplaning and hail became too hazardous. While we waited the storm out, I started to think about the terminology I'd probably have to use for this series.

One of the bad habits of network-infrastructure enthusiasts is that we sometimes take for granted that most people don't actually remember that the Internet is composed of objects. We are kind of curmudgeons about this. We are basically that “old man yells at cloud” Simpsons meme, but for IBM commercials.

But, the curmudgeon gruffly acknowledges, most people don't interface with or experience The Cloud as objects, and differentiating between the different types of objects that make up that network isn't always easy. It probably doesn't help that as a whole, these different objects are described in a bunch of mixed metaphors that all kind of work, but don't really work great together.

The other tricky thing about differentiating the various sites and objects that make up the Internet is that they're almost all really boring-looking, windowless buildings with large, complicated HVAC systems. But these buildings do slightly different things. What follows is a brief overview of a few of the most important:

Internet Exchange or Carrier Hotel: Let's take it as a given that the Internet is a network of networks. People and companies running applications connect to the Internet using a variety of service providers (Level 3 Communications, Comcast, Sprint, Verizon). All these networks have to, at some point, talk to each other.

If I want to watch something on Netflix and I'm connected to the Internet via Time Warner, but Netflix is connected via Cogent Communications, my HTTP request for netflix.com has to cross from Time Warner's network into Cogent's, and Netflix's response has to travel back the other way. That crossing happens at an Internet exchange, or IX. Basically, these are buildings where cables and routers connect different networks to each other. Sometimes they're called carrier hotels, because they're where all the “common carriers,” as it were, “check in” with each other (come for the network infrastructure, stay for the dad-joke terminology).

Internet exchanges are scattered all over the place (TeleGeography maintains a pretty cool map of them), but a lot of them end up in areas where there's a lot of “Internet backbone,” another vocabulary word that infrastructure enthusiasts use but never really explain particularly well. Basically, “Internet backbone” is a term for a really large concentration of network infrastructure (mostly fiber-optic cables) converging in a particular area due to various political, historical, and environmental conditions. Lots of Internet traffic moves through these areas. It might be more accurate to call Internet backbone “Internet spinal cord” or even “Internet spinal fluid” for biological-metaphor accuracy, but that sounds kind of gross.

There will be a story about a pretty cool Internet exchange later on in this series, but there is a bit more America to travel through first.

Colocation Data Center: Data centers are not in and of themselves cloud infrastructure, and data centers were a thing long, long before people excitedly talked about The Cloud. The premise of colocation is kind of what it sounds like: Companies put their servers in the same place. In general, colocated servers are hardware that individual companies own and bring into a shared facility, which provides the space, power, and cooling.

One trait distinguishing cloud infrastructure from vanilla colocation is that data doesn't really live on a single server. Databases are distributed across multiple servers, stored in fragments sometimes called “shards” (a term that an apocryphal rumor I desperately hope is true attributes to the MMO Ultima Online). If you are using cloud infrastructure and you're not a giant company like Amazon, you usually don't own any of the hardware—it's more like you're renting it. If, in your life, a Man from Sales named Chad (he is, always and forever, named Chad) ever tells you about “platform as a service,” this is literally all he means.
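For the curious, the basic sharding trick is simple enough to sketch in a few lines of Python. This is an illustrative toy under my own assumed names (`NUM_SHARDS`, `shard_for`), not any real provider's API: a record's key gets hashed, and the hash decides which server, or shard, holds that record.

```python
import hashlib

# Toy hash-based sharding. Pretend each shard number is a different
# physical server somewhere in a windowless building.
NUM_SHARDS = 4

def shard_for(key: str) -> int:
    """Map a record's key to the shard that stores it."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SHARDS

# The same key always hashes to the same shard, so any machine can
# work out where a record lives without asking a central directory.
placement = {}
for user in ["ada", "grace", "alan", "edsger"]:
    placement.setdefault(shard_for(user), []).append(user)
```

Real distributed databases use fancier schemes (consistent hashing, range partitioning) so that adding a server doesn't force every record to move, but the core idea is the same: the data is scattered, and the math tells you where to look.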

Purpose-Built Data Centers and Retrofitted Data Centers: This is a distinction between basically building an entirely new building from scratch just to be a data center and taking a building previously used for something else and making it a data center. Giant companies that build their own data centers, like Google, often do both. Sometimes they make entirely new buildings from scratch, like one we were driving out to see in Iowa, and sometimes they repurpose old ones, like this paper mill in Finland. There are different tradeoffs to either approach.


This is, admittedly, a broad-strokes overview of network infrastructure, and I'd hesitate to reduce whatever The Cloud is to the objects and buildings we happened to be driving out to see.

Whether we accept the metaphor of The Cloud or install a browser plugin to taunt it, it is part of a fundamental change in how people live with and perceive the Internet. It has less to do with marketing-speak, and more to do with the fact that while waiting out a flash flood in Arizona, I could look on my phone at a radar map of that flash flood. The Cloud facilitates a particular kind of time travel, not one of paradoxical leaps forwards or backwards but a flattening of space-time into a constant real-time now.

Somewhere, in a sleepless haze of writing terminology notes, I'd written “TIME MACHINES OF THE AMERICAN WEST.” When the rain let up, we were about six hours behind schedule, and we had a lot of time machines to go see.