Against the Video-Game Canon
It’s impossible to play every great new title, but in a time when independent options abound, fans should embrace the array of choices.
It all started with that one spreadsheet: a trifle to amuse myself between bouts of frenzied editing and marathon writing sessions. It was called “the Database”—an inside joke shared by me and no one else—and it was my attempt at listing every video game I had ever played in my life.
At first, I thought it insurmountable; the kind of self-assigned project that I would sink a few careless hours into before consigning to the same dustbin shared by my teenage projects. After all, I had spent most of my life playing video games, or at least what felt like it. How could I possibly account for all those hours, days, months that now congeal in my memory like raindrops on glass?
But it wasn’t insurmountable. I would hesitate to even call it difficult. It took a few nights of sifting through wikis, databases, and old cardboard boxes, but soon enough I came to something resembling a final count. And it was just a small fraction of the estimate that had rattled in my head since I first started delving. Frantically, I searched, and scrounged, and scavenged. And though the count nudged northward ever so slightly, my disposition remained the same. All those worlds conquered, castles plundered, tales eternally retold—when tallied up into one small sum, they no longer seem larger than life. Rather, our lives seem smaller than them.
Some may find this sort of ludic anxiety odd, even inexplicable. People engage with games for many different reasons, but for most of us, the titles that line our shelves are there chiefly to amuse us; surely wondering “am I playing the right games?” constitutes a contradiction of the first order. Despite this, such thoughts seem to strike many players with alarming regularity—that is, if message boards, chatrooms, and other such net ephemera are to be trusted. Even the salaried critics, arguably the group best situated to fully burrow into gaming’s ample trove, lament the wealth of titles left unplayed at every turn. The phenomenon, it seems, is endemic. Worse still, this question can beget other, more latent anxieties. If one had just found more time to play, would the dreaded backlog stack quite so high? And then there’s the matter of the games one did get around to—were they explored as fully as one could muster, or is there some deep dungeon worth the hours it would take to plumb its final depths?
These are unfair questions, of course; as useless to the asker as they are unanswerable. However, like most unfair questions, they sometimes have the power to reveal something about the people they vex so utterly. Here, the three entwine to form a familiar inquiry—one that lurks in the minds of enthusiasts and newcomers alike, and that those in the community cannot begin to pry themselves from—“am I playing wrong?”
Despite its prevalence, this brand of negative thinking is hardly specific to gaming. The choice-focused aesthetics of some interactive media certainly exacerbate it, but the phenomenon has its roots not in any sort of media discipline, but in the “soft” science of behavioral economics, a psychology-heavy sub-field focusing on how people make decisions in a controlled environment rather than the chaos of the open market. In 1979, Daniel Kahneman, one of the field’s pioneering minds, co-authored a paper with Amos Tversky that first formalized loss aversion, a bias closely related to what would later be called the “sunk cost fallacy.” Eschewing the traditional model, which posits absolute rationality as the linchpin of economic decisions, their work suggests that the average person is far more sensitive to the loss of a dollar than to the gain of the same amount. Or, put more broadly: People tend to be far more concerned with perceived loss than with actual gain. Perhaps that’s why some of us struggle to recall the laundry list of titles we touched in the past year while never forgetting that one gem we missed—in my case, The Witcher 3 (2015).
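For readers curious about the math behind that asymmetry, loss aversion is often illustrated with the value function Kahneman and Tversky later estimated for prospect theory. The sketch below uses their commonly cited 1992 parameter estimates (a curvature of about 0.88 and a loss-aversion coefficient of about 2.25); the function name and the example values are illustrative assumptions, not anything from this essay.

```python
# Illustrative sketch of the prospect-theory value function,
# using the commonly cited Tversky & Kahneman (1992) parameter
# estimates. Names and numbers here are for demonstration only.

def subjective_value(x, alpha=0.88, lam=2.25):
    """Perceived value of a gain (x >= 0) or a loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    # Losses are scaled up by lam, so they "hurt" more.
    return -lam * (-x) ** alpha

# A one-dollar loss looms larger than a one-dollar gain:
print(subjective_value(1))   # 1.0
print(subjective_value(-1))  # -2.25

# Winning a dollar and then losing one nets out as a negative feeling:
print(subjective_value(1) + subjective_value(-1) < 0)  # True
```

Under these assumed parameters, the missed gem (a loss) simply weighs more than any individual game played (a gain), which is the asymmetry the essay describes.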
While bouts of this sort of self-flagellation crowd comments sections and forums the Internet over, perhaps the most visible examples of it occur at the end of the year, when small-time blogs and big sites alike convene their award jamborees dedicated to honoring what they deem the best games of the year. And though these events can produce some surprising results, more often than not editors will find themselves settling on the heavy favorite simply because it’s the only one everyone actually got around to. That’s not to say that they should be expected to play every “notable” game that comes out—such a demand would be unreasonable—but when a system so clearly privileges the sort of big-budget megagame that dominates the press cycle to the detriment of smaller, perhaps more deserving games, it’s probably time to consider what exactly we intend for it to do.
What’s so insidious about the continued tradition of the institutional top-10 list is that it feeds and perpetuates the same cycle of anxiety and doubt that many players find themselves falling into. With few exceptions, these lists compound upon each other to present the whole of gamedom as a set of tiers delineated by opaque criteria—an elite one you must play, an abysmal one you must laugh at, and nothing but forgettable slush in-between. Yet, despite the recent explosion of disparate and challenging independent games onto the scene, that “elite tier” is still populated with the same crop of blockbusters that strive so hard to say nothing at the loudest possible volume. This annual tradition, ostensibly constructed to showcase the vibrancy of the medium, now seems a testament to Kahneman’s sunk cost—an army of editors shrugging at the masses of games overlooked.
Perhaps this is too harsh; even in this time of transition, it’s easy to feel as if the changes that one would like to see aren’t happening nearly quickly enough. And it’s not the list-making impulse in itself we should find fault with, as it has existed for far longer than any of us. The idea of such a “canon” dates back to at least Aristotle. The concept seems sound enough. But while such lists can sometimes serve as useful guides to the uninitiated, their construction is often rife with the worst trends in media, such as elitism and unapologetic bigotry of all stripes. Such rampant gatekeeping has produced a backlash against the concept, with some labeling it the domain of the “deadest whitest men.”
Still, even firm advocates of this canonizing impulse can recognize the limitations of the current approach. By definition, these lists bind themselves to the cultural now, allowing one-time classics like Legacy of Kain: Soul Reaver (1999) and Klonoa 2 (2001) to slip into the cultural void. The more comprehensive catalogs rarely do much better, reproducing the same biases and blindspots of the top-10 lists they so rely upon. No matter the crop of games, some always wield hyperbole like a weapon, taking care to strike only the titles with the most cultural currency. And, worst of all, this focus on the present allows us to turn a blind eye to the same systemic faults that rear their ugly heads year after year.
For now, this is what it means to be well-played—to doggedly cling to a list of mass-approved titles included more for their historical value than their actual quality; to accept the latent mechanical hegemony assumed by the critics of yesteryear; to allow oneself to be swept up in the anxiety and secondhand hype for the sake of someone else’s curriculum. No one guided us to this definition. No, we were driven to it, by the same creeping insecurity that shook me as I desperately tried to quantify my own sunk cost.
Whether or not we admit it, there is no denying that the world of video games has left these archaic constructs behind. What was once niche now tops the Steam sales charts, and what was once sub-niche can still draw enough to support a modest Patreon. Game makers like Increpare and Christine Love continue to push the boundaries of the medium, sparking the sorts of messy conversations that help creators and critics alike come to terms with their own aesthetics. The standardized yardstick that critics once used to measure every game against another has shattered, taking a wealth of old assumptions with it. And while it’s easy to nod our heads in acknowledgment of this, altering the course of this rudderless ship has proved to be a task more laborious than any would have thought. We call this a time of opportunity, but few seem keen on leading it. So, if there’s nothing keeping you here, jump out and swim toward your own destination. Where you’re going, you don’t need a map.
This post appears courtesy of Kill Screen.