Some may find this sort of ludic anxiety odd, even inexplicable. People engage with games for many different reasons, but for most of us, the titles that line our shelves are there chiefly to amuse us; surely wondering “am I playing the right games?” constitutes a contradiction of the first order. Despite this, such thoughts seem to strike many players with alarming regularity—that is, if message boards, chatrooms, and other such net ephemera are to be trusted. Even salaried critics, arguably the group best situated to burrow into gaming’s ample trove, lament at every turn the wealth of titles left unplayed. The phenomenon, it seems, is endemic. Worse still, this question can beget other, more latent anxieties. If one had just found more time to play, would the dreaded backlog stack quite so high? And then there’s the matter of the games one did get around to—were they explored as thoroughly as one could manage, or is there some deep dungeon still worth the hours it would take to plumb its final depths?
These are unfair questions, of course, as useless to the asker as they are unanswerable. However, like most unfair questions, they sometimes have the power to reveal something about the people they vex so utterly. Here, the three entwine into a familiar inquiry—one that lurks in the minds of enthusiasts and newcomers alike, and from which those in the community can never quite pry themselves—“am I playing wrong?”
Despite its prevalence, this brand of negative thinking is hardly specific to gaming. The choice-focused aesthetics of some interactive media certainly exacerbate it, but the phenomenon has its roots not in any media discipline, but in the “soft” science of behavioral economics, a psychology-heavy subfield focused on how people actually make decisions—often in controlled experiments rather than the chaos of the open market. In 1979, Daniel Kahneman, one of the field’s pioneering minds, co-authored with Amos Tversky the paper that introduced prospect theory, laying the groundwork for what would later be called the “sunk cost fallacy.” Eschewing the traditional model, which posits absolute rationality as the linchpin of economic decisions, their work suggests that the average person is far more sensitive to the loss of one dollar than to a gain of the same amount. Or, put more broadly: people tend to weigh perceived losses far more heavily than actual gains. Perhaps that’s why some of us struggle to recall the laundry list of titles we touched in the past year while never forgetting the one gem we missed—in my case, The Witcher 3 (2015).
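For readers who want the formal version, loss aversion is usually expressed as a value function that is steeper for losses than for gains. A standard formulation, with the parameter estimates from Tversky and Kahneman’s 1992 follow-up study rather than the original 1979 paper, looks like this:

$$
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \\
-\lambda\,(-x)^{\alpha} & \text{if } x < 0
\end{cases}
\qquad \alpha \approx 0.88,\quad \lambda \approx 2.25
$$

With λ sitting around 2.25, a loss stings roughly twice as much as an equal gain pleases, which goes some way toward explaining why one missed gem can outweigh a whole year of games actually played.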
While bouts of this sort of self-flagellation crowd comments sections and forums the Internet over, perhaps the most visible examples occur at the end of the year, when small-time blogs and big sites alike convene their award jamborees dedicated to honoring the games they deem the year’s best. And though these events can produce some surprising results, more often than not editors will find themselves settling on the heavy favorite simply because it’s the only game everyone actually got around to playing. That’s not to say they should be expected to play every “notable” game that comes out—such a demand would be unreasonable—but when a system so clearly privileges the sort of big-budget megagame that dominates the press cycle, to the detriment of smaller, perhaps more deserving games, it’s probably time to consider what exactly we intend for it to do.