Starting a cross-country drive to New York in Los Angeles is pretty inconvenient, unless your cross-country drive is also a vision quest to see the Internet. More specifically, we'd been tasked with going to see “the cloud”—a term I've generally been loath to embrace, as I tend to think of it as a pernicious metaphor encouraging unrealistic collective fictions. Which, as someone who actively chooses to live in New York, is kind of how I imagine Los Angeles.
But we didn't come to L.A. just to ensure that this road trip had adequate Pynchonesque vibes. We came to L.A. because we wanted to start the drive where the cloud started—and we decided that meant going to where the Internet started, which, in turn, meant going down a rabbit hole of debated histories that, for better or worse, deposited us in room 3420 at the University of California, Los Angeles's Boelter Hall. This was the home of UCLA's Network Measurement Center, which, between 1969 and 1975, served as one of the first nodes of the ARPANET.
The problem that cloud computing seeks to solve isn't radically different from the ones that led to the development of ARPANET. According to the National Institute of Standards and Technology, cloud computing is “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources … that can be rapidly provisioned and released with minimal management effort or service-provider interaction.” That doesn't sound so far from the objectives of time-sharing, a concept that emerged in the mid-1950s and whose name is largely credited to the computer scientist John McCarthy.