As the holiday sales season reaches its peak this week, video games are proving themselves, as always, to be some of the biggest winners. When the new video game World of Warcraft: Cataclysm was released last week, it sold 3.3 million copies in its first 24 hours on the market. That feat came only a week after Activision Blizzard (the company that produces the World of Warcraft game) set a five-day sales record of $650 million with the newest version of its popular Call of Duty first-person shooter. Not that game popularity is limited to the holiday season. A silly application game called Angry Birds (in which players catapult cartoon birds at fortresses built by pigs who've stolen the birds' eggs) has been downloaded by more than 50 million people over the past year.
What's more, the time people spend playing all those games is increasing. The 12 million subscribers to World of Warcraft spend a combined 200 million hours or so each week engrossed in the game. And according to the Kaiser Family Foundation, the amount of time children spend playing video games has almost doubled in the past 10 years. By the time today's teenagers reach the age of 20, they will have spent an average of 10,000 hours playing video games -- the equivalent of five working years.
All of which is to say: Video games are huge. Very huge. And getting bigger. Exactly what one thinks about this development depends partly on how one views the possibilities or evils of technology. But a growing number of people are trying to figure out whether, or how, other parts of life -- from school to exercise to work to household chores -- could be structured to capture the same kind of attention, energy, focus, and potential addiction that video games inspire.
On one level, the results reveal some fascinating -- if slightly embarrassing -- facts about human responses and behaviors. For all of our advanced mental capacity, it appears we respond to stimuli very much like lab rats.
In an article in last week's Science Times, John Tierney said a crucial element of the appeal of video games was the fact that they provide "instantaneous feedback and continual encouragement ... while also providing occasional unexpected rewards." That assessment mirrors the research of Dr. Paul Howard-Jones, a British neuroscientist who has found (as a Times article reported earlier this fall) that "children's engagement levels are higher when they are anticipating a reward but cannot predict whether they will get it."
As anyone who's ever taken college psychology knows, lab rats, too, will continue to press a lever as long as they keep getting rewarded, but will stick with the task far more persistently if the rewards are unpredictable. I'd love to be able to argue that human motivations are more complicated than that (and, indeed, in some or many areas of behavior, they are), but the evidence for how easily we get transfixed by little bells, lights, and other sensory reward pellets is hard to ignore.
In addition to video games, there is, for example, the increasing number and popularity of individual activity tracking devices and websites that allow people to record, upload and share (and in some cases get rewarded with charity donations for) their exercise during the day: FitBit, Nike Plus, Run Keeper, the Garmin FR 60, FitDay, FitWatch, MapMyRun, FitTracker ... the list goes on and on, even though the phenomenon puzzles me just a bit.
Really? We're more motivated to exercise if a little gizmo lets us track each step, and we can see cool little trend lines or achieve new levels on a computer screen? Apparently. Come to think of it, running on a treadmill in pursuit of those disappearing dot-lines of hills and distance conquered doesn't, on the surface, differ all that much from hamsters running on an exercise wheel in pursuit of ... well, whatever it is they're pursuing.
Not that our connection with rodents in the behavior/motivation category is all that surprising. There's a reason, after all, that researchers watch rat behavior in an effort to understand humans. But the comparisons have their limits. Even if it's true that we respond to immediate feedback, constant encouragement, and unexpected rewards, how transferable, really, is the video game framework to real life?
A New York Times Magazine article this fall profiled "Quest to Learn," an experimental school in New York City that uses a video game format as a primary framework for teaching. Instead of grades, students achieve levels of experience. Assignments are gaming problems, or "quests," that often require applying multi-disciplinary skills (English, history and math, for example) to complete. In some cases, the assignments involve designing a video game itself.
On some levels, using a video game approach to learning a subject makes a lot of sense, and might work well with kids. One of the big complaints kids have about school is that they can't see how or why what they're learning applies to their lives. Creating a game task that requires an algorithm to solve it, or English skills to describe a story line, or enough history knowledge to make a game realistic, provides context and illustrates the relevance of individual skills. And changing the emphasis from grades to knowledge is far more appropriate, in the big scheme of things. After all, it's not your grades that matter. It's the knowledge, skills, and understanding you acquire and retain after the test is over.
But what about adults? Can employers make working for a defense contractor a larger-scale version of Star Wars: The Old Republic? And can a website that offers video game rewards for doing real-world tasks like cleaning the toilets really motivate adults to do unpleasant tasks they otherwise would avoid? (Seriously -- there's a website called ChoreWars that offers this service. Some testimonials on the site swear it helps, even though one reviewer called it one of the "Dumbest Start-Ups of 2007" when it was released.)
For all the appeal of gaming, I'm skeptical. Even if we respond well to silly little rewards or birds smashing against castle walls, video games do not operate like life. On several levels. For one thing, there is no task in a video game -- not even repetitive zapping of evil gnomes, forging sword after sword after sword to earn cash credits, or the endless running, running, running to get from point A to all other points -- that is anywhere near as unpleasant as cleaning a toilet. Video games are, relatively speaking, all fun. Life is not always all fun.
Second, video games are linear. Your character proceeds through a task, then a sub-task, and perhaps spends some time on a diversionary point or resource-improving side story line. But each task requires singular focus, and is a fairly straightforward challenge. Real life is multi-dimensional, multi-task, and, in most cases, hugely complex. The problem of defeating a virtual wizard or conquering a town in a game is a tough, but straightforward, task. The problem of figuring out how to win over a big client who's unhappy about your team, while your factory is experiencing technical problems and delays, the competition is releasing a similar product, and one of your executives is revealed to be skimming from the expense till ... is a problem less suited to immediate feedback, constant encouragement, or easy-to-assess level achievements.
Third, there is no such thing as dismal failure in a video game. Educators at Quest to Learn are trying to replace the sense of failure in kids with a more aspirational "haven't succeeded yet." It works in video games -- you get killed 20 times, and finally defeat the opponent once you figure the game out well enough or develop better playing skills. It might also work in school. And even, on a large, philosophical level, in life. But in a job, where you are paid a certain amount to perform a certain task, the "oh, well, you failed for six months, keep trying" approach is not realistic. In the real world, you are expected to perform at a certain level. It's not about learning. It's about doing. And both the possibility of real failure and the real consequences for that failure are, well ... real.
We may love video games because they appeal to our craving for entertainment, reward and the possibility of achieving control over some world, even if it's not the real one. And we may be closer to rats than we'd like to admit, in terms of how easily we're motivated to perform repetitive activities in pursuit of small "pellet" rewards. Especially when they come in electronic form.
But there's still a difference between virtual reality and reality itself, and limits to how far the game analogy goes. That may be unfortunate, from the perspective of employers wishing to better motivate their employees. But on the other hand, no virtual reward is as sweet as those that come from real experience, real risk and real achievement. And those achievements are generally driven not by artificial frameworks, but by someone caring deeply about accomplishing the challenge at hand. Explaining a goal in terms of the compelling mission or quest it is furthering can certainly help create that motivation in employees or team members. The military relies on it, in fact. But engendering that kind of motivation often comes from a team's sense that their task is anything but a game. And that takes more than bells and whistles to sustain.