At the start of every school year, Diane Levin, an education professor at Boston’s Wheelock College who teaches a course called “Meaning and Development of Play,” has her students interview people of different ages about how they used to play when they were children. The results are not surprising: Every year, her students report that interview subjects over age 50 played outside all day in big groups of their peers, with a few toys (“maybe a ball”) and no adult supervision. People between the ages of 20 and 40, who grew up in the 1980s, ’90s, and early 2000s, watched a lot of television but still played outside, often playing make-believe games inspired by TV shows and movies.
For young people today, however, it’s a different story. “They hardly play. If they do play it’s some TV script. Very prescribed,” Levin said. “Even if they have friends over, it’s often playing video games.”
That was before Pokémon Go, though.
The augmented-reality (AR) game—which, since its release on July 6, has attracted 21 million users and become one of the most successful mobile apps ever—has been praised for promoting exercise, facilitating social interactions, sparking new interest in local landmarks, and more. Education writers and experts have weighed in on its implications for teaching kids everything from social skills to geography, to the point that such coverage has become cliché. And while it seems clear at this point that the game is a fad that has peaked—it’s been losing active players for over a week—one of its biggest triumphs has, arguably, been the hope it’s generated about the future of play. While electronic games have traditionally caused kids to retreat to couches, here is one that did precisely the opposite.