Facebook stuck to its guns, while YouTube eventually “demonetized” Crowder’s account. Both decisions were mocked and defended, crystallizing just how disputed the terrain of content moderation on platforms has become. The idea of “a platform” doesn’t make sense anymore, and it’s being challenged from every direction.
Beyond the Maza-Crowder dispute, just in the past week, researchers demonstrated that YouTube’s algorithm seemed to lead users to sexualized videos of children, and The New York Times ran a front-page story about how a young man was radicalized by right-wing videos on the site.
To defend themselves, the so-called platforms have developed byzantine sets of rules. If they follow the guidelines they make up, they say, they are fulfilling their obligations to their various kinds of users. This week, YouTube’s CEO, Susan Wojcicki, tried to explain her company’s actions at the Code Conference. She mentioned the word “policies” 14 times. “We need to have consistent policies,” she said. “They need to be enforced in a consistent way. We have thousands of reviewers across the globe. We need to make sure that we’re providing consistency.” Of course, the policies are always changing and can be revised at any time, and yet these ever-shifting rules are supposed to be enforced consistently. It’s a mess.
Even simple brand-promotion moments have become complicated. YouTube’s annual “Rewind” video, which the brand uses as a showcase for its stars, has become a battleground over what YouTube is. When last year’s version left out old-school YouTubers such as the controversial PewDiePie, it became the most disliked video in the site’s history. Fans who saw themselves as part of the “real” YouTube community panned YouTube for catering to more advertising-friendly, professional creators. “The community, which was once celebrated by YouTube, no longer feels included in the culture YouTube wants to promote,” The Verge summarized.
Many disputes—about “community,” white-supremacist content, online harassment, or the supposed liberal bias of these services—devolve into the carefully massaged language of some policy team. Sometimes the labor model of content moderation comes under scrutiny; sometimes it’s the accuracy or ethics of particular algorithmic decisions.
Don’t let the minutiae distract from what’s really happening: An era-defining way of thinking about the internet—“the platform”—has become unstable.
There was a time when there were no “platforms” as we now know them. That time was, oh, about 2007. For decades, computing (video games included) had used the term “platform.” In the mid-2000s, Tim O’Reilly and John Battelle proposed “the web as a platform,” focusing primarily on the ability of different services to connect to one another.
The venture capitalist Marc Andreessen, then a co-founder of the also-ran social network Ning, blasted anyone who wanted to extend the definition. “A ‘platform’ is a system that can be programmed and therefore customized by outside developers,” he wrote. “The key term in the definition of platform is ‘programmed.’ If you can program it, then it’s a platform. If you can’t, then it’s not.” My colleague Ian Bogost, who co-created an MIT Press book series called Platform Studies, agreed, as did most people in the technical community. Platforms were about being able to run code in someone else’s system.
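Andreessen’s test is concrete: a platform exposes hooks that outside code can call and build on. As a minimal sketch of what that looks like in practice (the endpoint, token, and fields here are hypothetical placeholders, not any real service’s API):

```python
# A minimal sketch of "programmable by outside developers."
# The endpoint, token, and fields below are hypothetical, used only to
# illustrate Andreessen's definition -- not any real service's API.
import json
import urllib.request

API_BASE = "https://api.example-platform.com/v1"  # hypothetical platform API
ACCESS_TOKEN = "token-the-platform-issues-you"    # credentials granted to outside developers

def post_update(text: str) -> dict:
    """Outside code customizing the platform: publish a post through its API."""
    request = urllib.request.Request(
        f"{API_BASE}/posts",
        data=json.dumps({"text": text}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

By this definition, the ability to point code like that at someone else’s system is what makes it a platform; a site you can only browse is not one.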