Monika Bickert is a serious, impressive person. Before she became Facebook’s head of global policy management, she put her Harvard law degree to work as an assistant U.S. attorney going after corrupt government officials.
On February 2, at Santa Clara University School of Law’s Content Moderation and Removal at Scale conference, organized by Eric Goldman, the director of the school’s High-Tech Law Institute, Bickert spoke deliberately and precisely about how Facebook’s content-moderation team and policies are constructed.
Bickert emphasized that humans are essential to the project of content moderation, saying that Facebook now has 7,500 content moderators around the world. That meets the hiring goal Mark Zuckerberg set in May 2017, when the company had only 4,500. In other words, in the past eight months Facebook has added nearly as many content moderators as Twitter or Snap employs in total.
And they’re not hiring most of those people in Silicon Valley.
“Content reviewers tend to be hired for their language expertise, and they don’t tend to come with any predetermined subject-matter expertise. Mostly they are hired, they come in, and they learn all of the Facebook policies, and then over time, they develop an expertise in one area,” she said. “The review team is structured in such a way that we can provide 24/7 coverage around the globe. That means that we often are trying to hire a Burmese speaker in Dublin, or come up with other ways of staffing languages so that the content can be reviewed or responded to within 24 hours. That’s our goal. We don’t always hit it.”