“We work hard to make Facebook as safe as possible while enabling free speech,” said Monika Bickert, Facebook’s Head of Global Policy Management. “This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”
Let’s stipulate that these are difficult decisions on an individual basis. And let’s further stipulate that multiplying the problem by 2 billion users makes the task daunting, even for a company with $7 billion on hand. Facebook has committed to adding 3,000 more content moderators to the 4,500 working for the company today.
But is Facebook’s current approach to content moderation built on a firm foundation? As it stands, that approach risks abdicating the responsibility that the world’s most popular platform needs to take on.
“When millions of people get together to share things that are important to them, sometimes these discussions and posts include controversial topics and content,” we read in the training document. “We believe this online dialogue mirrors the exchange of ideas and opinions that happens throughout people’s lives offline, in conversations at home, at work, in cafes and in classrooms.”
In other words, Facebook holds that the posts on its platform are merely a reflection of what is, rather than a causal factor in making things come to be.
Facebook must accept the reality that it has changed how people talk to each other. When we have conversations “at home, at work, in cafes, and in classrooms,” there is not an elaborate scoring methodology that determines whose voice will be the loudest. Russian trolls aren’t interjecting disinformation. My visibility to my family is not dependent on the quantifiable engagement that my statements generate. Every word that I utter or picture that I like is not being used to target advertisements (including many from media companies and political actors) at me.
The platform’s own dynamics are a huge part of what gets posted to the platform. They are less a “mirror” of social dynamics than an engine driving them to greater intensity, with unpredictable consequences.
Facebook’s Mark Zuckerberg seemed to acknowledge this in his epic manifesto about the kind of community that he wanted Facebook to build.
“For the past decade, Facebook has focused on connecting friends and families,” he wrote. “With that foundation, our next focus will be developing the social infrastructure for community—for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all.”
To get this “social infrastructure for community” right, Facebook has to acknowledge that it has not merely “connected friends and families.” It has changed their very nature.