Content moderators review the dark side of the internet. They don’t escape unscathed.
Lurking inside every website or app that relies on “user-generated content”—so, Facebook, YouTube, Twitter, Instagram, Pinterest, among others—there is a hidden kind of labor, without which these sites would not be viable businesses. Content moderation was once generally a volunteer activity, something people took on because they were embedded in communities that they wanted to maintain.
But as social media grew up, so did moderation. It became what the University of California, Los Angeles, scholar Sarah T. Roberts calls “commercial content moderation,” a form of paid labor that requires people to review posts—pictures, videos, text—very quickly and at scale.
Roberts has been studying the labor of content moderation for the better part of a decade, ever since she saw a newspaper clipping about a small company in the Midwest that took on outsourced moderation work.