How Early Computer Games Influenced Internet Culture

Virtual playspaces of the 1980s encouraged openness and creativity, which would later become foundational values of the web.

"Batman: The Caped Crusader," originally released in 1988, on the Apple IIc  (Matthew Pearce / Flickr)

Sometimes it seems the Internet is, at its core, a tremendous nostalgia machine. You are, at any given moment, just a few clicks and keystrokes away from local television that aired 40 years ago, from discontinued toys, and from sounds you haven’t heard in forever (or at all).

It seems fitting, if not outright magical, for example, that the immersive virtual worlds of my youth—computer games like Lemonade Stand (1979), The Oregon Trail (1979), Choplifter (1982), Carmen Sandiego (1985), and Think Quick! (1987)—are all playable online.

Screenshot from an online emulator of the 1985 edition of The Oregon Trail (Internet Archive)

Back when I was playing these games on my family’s Apple IIc, they were often side-eyed by a generation that had first encountered home computers as adults—and distrusted gaming as a waste of time. If anything destroys the practice of reading once and for all, The New York Times warned in 1994, it will be computer games.

Computer games did not destroy reading.

Instead, games had a profound effect on the popular perception of computers. The ginormous machines of the 1960s and 1970s were viewed as either complex and boring or, in science fiction, threatening. “I think the fact that computers were primarily used to play games really helped to get people to accept that computers were good and helpful devices instead of the negative portrayals of them in the ‘60s and ‘70s in movies,” said John Romero, the co-founder of id Software and the designer of several hugely popular games including Wolfenstein 3D (1992), Doom (1993), and Quake (1996).

Screenshot from Choplifter (Internet Archive)

Games also shaped people’s understanding of what computers are for—and what humankind’s relationship with such machines could be like. Computers were serious tools, yes, but they could enable exploration and experimentation, too. Computers weren’t just expensive calculators or advanced typewriters; they were fun.

“I strongly believe that games have been largely underrated in the spread of what we might think of as our ‘orientation’ toward computing,” said Laine Nooney, a cultural historian of video games and computing. “In the span of less than 20 years, many Americans went from having never seen a computer to interacting with these machines in many facets of their daily lives. Gaming is the first form of computational technology most of us ever handled … Games taught us principles of interaction and screen responsiveness, about coordination between hand and eye, how to type, how to sit, how to look at a screen.”

Screenshot from an online emulator of the 1985 edition of Carmen Sandiego (Internet Archive)

Early games also served a crucial function in shaping the broader culture of computing—including promoting central ideas about openness and sharing that would eventually become foundational values of the web. In the late 1970s and early 1980s, game makers—like anyone tinkering with computers at the time—were inclined to share what they learned, and to build on one another’s designs. This dynamic was, Nooney says, an “outgrowth of the general need for a decently sized but geographically disparate group of microcomputer users to trade knowledge.”

The thriving culture of early computing magazines was a direct byproduct of this need—people helped debug each other’s code, swapped tips about hardware, and generally supported one another. “It wasn't uncommon for someone who had written in with a question and included their address to receive a phone call from another computer user helping them with their question—since all another reader had to do was get the operator on the phone and request a lookup by name and town,” Nooney said.

Screenshot from an online emulator of Lemonade Stand (Internet Archive)

“The Apple II is really a product of the hacker culture which originally came from universities—they were the only organizations able to afford mainframes and later minicomputers but still allow freedom to play and experiment,” said Bill Budge, the legendary game designer who created Raster Blaster (1981) and other early games. “Certainly, the Apple II exposed more people to computers than was possible at universities. When I started at Google, I was surprised by how many engineers had played my games as kids and been inspired to pursue tech at a very early age. So the Apple II was definitely an accelerator of hacker culture.”

That same culture, and the premium it placed on openness, would eventually carry over to the early web: a platform that anyone could build on, that no one person or company could own. That idea is at the heart of what proponents of net neutrality are trying to protect—the belief that openness is a central value, perhaps even the foundational value, of what is arguably the most important technology of our time.

But there isn’t a straight line from early computing culture to early web culture—even though many of the same players were involved in early game development and early online communities. In fact, in the mid-1980s, at a time when many Americans were buying personal computers for the first time, the open culture that once permeated gaming (and computing in general) changed dramatically.

“Nintendo arrives on the scene just about the time the Macintosh does, and it has a similar effect on computing by advancing the idea that closed systems—designed systems that users don’t fool around with—are the way to go,” said Henry Lowood, a historian of technology at Stanford University Libraries.

It wasn’t just Nintendo or Apple. The open culture at Atari, for instance, had already begun to shift after Warner Communications acquired the company in 1976. At the time, Warren Robinett—who designed Adventure for the Atari 2600, widely regarded as the first graphical adventure game—was told by supervisors not to work on the game; they didn’t think the concept was feasible, given the memory constraints of computing systems of the era.

The Atari 2600 (Wikimedia)

“I worked on it in secret,” Robinett told me. “I produced what you could call a feasibility demonstration, and I showed it to some of the other people at Atari. If [my boss] had been a more powerful person, he would have just crushed me. Adventure could have easily not happened if things had been slightly different.”

Adventure’s enormous impact on the future of gaming—and, arguably, on the broader visual aesthetic that would come to dominate the web—is difficult to quantify. “I realize that to somebody who grew up after adventure games became commonplace, it seems like Adventure was probably handed down from a mountain on a stone tablet, but it wasn’t,” Robinett said. “I was lucky that I was in the right place at the right time, and got to make one of those little innovations that had an impact and changed the way we do things.”

But technological progress has a way of seeming inevitable in retrospect. From the player’s perspective, the culture that defined the early web seemed similarly preordained—like a natural extension of the dynamics that shaped virtual spaces in offline gaming.

A screenshot from the “Castle Creator” mode of Think Quick! (Internet Archive)

I grew up playing Think Quick!, for example, a game released in 1987 by The Learning Company, which Robinett co-founded after his time at Atari. The game involved exploring the rooms of a castle: opening trap doors, finding secret objects, evading the dreaded slime worms, and eventually ousting a dragon. It also offered a mode called Castle Creator—think of it as a kind of predecessor to Minecraft’s creative mode—that let players create their own levels. You could do everything from building the layout of the rooms to designing icons for hidden objects. Years later, when Romero created Doom, he took the idea of game modification a step further: players could build their own levels, and in some cases entirely new games, using the game files that Doom left accessible to them.

“Just the way there had been this interesting coincidence with Apple and Nintendo [introducing] these closed systems in 1984, there was another interesting coincidence that happens about 10 years later, and that coincidence is the launch of Doom and the web,” Lowood, the Stanford historian, told me. “Because Doom, the way it was architected, could be modified—that has been looked at by people in the technology community as the beginning of open source. This sort of open culture reemerged. It’s a key thing—open versus closed, sharing versus design-curated—and these tensions are really important parts of the history of computing.”

The cultural standing of video games, meanwhile, has only grown in the mobile Internet age. Even during the economic recession, the video game sector in the United States grew 9.6 percent and added $6.2 billion to the economy—more than four times the growth rate of the national economy as a whole, according to CNET.

“I don’t think it’s too much of an exaggeration to say games are everywhere,” Lowood said. “There are game-like systems built into a lot of parts of our lives. Games have to be seen not just as the specific play experience, but also as a part of the story of the impact of technology on our lives, a part of the story of different ways of learning, different ways of doing business. All of these changes that have occurred as a result of technology, games are a big part of those stories.”