According to its Friday blog post, it does this with only enough human oversight to prevent inaccurate stories from trending. Jonathan Zittrain, a professor of law and computer science at Harvard University, likened Facebook’s decision to use impersonal algorithms to “confining things to the roulette wheel.”
“Even the casino isn’t supposed to know what number is going to win when it spins. And so, if there’s some issue, at least it isn’t intentional manipulation by Facebook,” he told me. Google claims the same innocence with its search results, he said.
But the murkiness of machine-learning algorithms makes this kind of editorial abdication increasingly difficult to claim, he added: It can sometimes become impossible to tell exactly why a program is making the decisions it makes. And if an algorithm’s writers tune it to favor certain outcomes even slightly, the program itself may start to drift into regrettable habits.
“The algorithm gets smart enough that, even if the casino isn’t looking to put a thumb on the scale, thumbs will appear,” he said. “This isn’t just Facebook’s problem—this is one of the profound problems of our time.”
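To see how thumbs can appear without anyone intending them, consider a deliberately crude sketch. Everything in it is hypothetical: the single promotion slot, the numbers, and the update rule are invented for illustration and are not Facebook’s actual system. The point is only that a tiny initial tuning choice can lock in a runaway advantage once promotion feeds back into scoring.

```python
# A toy winner-take-all loop, not Facebook's actual system. Each round,
# whichever story type scores higher wins the single promotion slot,
# and the clicks that the slot generates feed back into its score.
score = {"sensational": 1.02, "sober": 1.00}          # a 2% initial edge
clicks_per_slot = {"sensational": 110, "sober": 100}

for _ in range(30):
    winner = max(score, key=score.get)                # top score takes the slot
    score[winner] += 0.001 * clicks_per_slot[winner]  # learn from engagement

print(score)
# "sensational" climbs to about 4.32 while "sober" never wins a single
# round: the 2% head start decided everything, and the gap only widened.
```

No one tipped the scale after the first round; the feedback loop did the rest.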
In the case of Trending, Facebook said its human veracity-checkers messed up. But the company has an interest in keeping other people out of the picture: By dismissing anyone who made any other kind of editorial judgment, it can keep asserting that it is only a technology company, not a media company. This is a rhetorical move with an ominous history. As BuzzFeed’s Charlie Warzel writes, the same claim allowed Twitter to ignore its culture of harassment, which now poses a major business threat to the company; and it permitted Uber to gain scale and skirt municipal oversight during its early years of explosive growth. (Facebook has hired journalists before, only to think better of it and dismiss them.)
But this prompts a second question: Even if algorithms are now running the show, is Facebook legally responsible for what happened over the weekend?
Let’s review the episode. For at least eight hours, Facebook promoted the topic “Megyn Kelly” because of a bogus but massively popular article claiming that Kelly, a Fox News anchor, had been fired from the network because she endorsed Hillary Clinton. It was totally wrong: She hadn’t, and she hasn’t.
Yet “the Trending review team accepted it thinking it was a real-world topic,” says Osofsky. It is unclear how this happened: The article was published by endingthefed.com, which is not a mainstream or particularly popular conservative outlet. Among the stories on its front page right now: “German Scientists Prove There is Life After Death.”
Thanks to Facebook’s help, the Kelly fabrication eventually racked up more than 200,000 likes. But here is a chicken-and-egg problem: As soon as a story starts “trending,” even if only several thousand people are talking about it, it immediately appears in front of millions of eyeballs. That brings it attention it would otherwise never receive, especially now that Trending seems to surface specific URLs rather than generic topics. (On Facebook’s desktop site, Trending Topics appears in a right-hand sidebar; in the mobile app, the topics populate after the user taps the search bar.)
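Some back-of-the-envelope arithmetic shows how lopsided that loop can be. Every number below is hypothetical, chosen only to illustrate the scale mismatch between organic chatter and Trending’s audience:

```python
# Hypothetical figures illustrating the chicken-and-egg dynamic.
organic_interactions = 5_000     # chatter that trips the "trending" wire
trending_audience = 10_000_000   # users who see the Trending module
click_rate = 0.01                # fraction who click the promoted topic
like_rate = 0.25                 # fraction of clickers who like or share

promoted_interactions = trending_audience * click_rate * like_rate
print(f"Before promotion: {organic_interactions:,} interactions")
print(f"From promotion:   {promoted_interactions:,.0f} more interactions")
# The 25,000 promoted interactions swamp the 5,000 organic ones, so the
# topic now looks even more "trending" than the signal that launched it.
```

Run those made-up rates forward a few cycles and a marginal story’s engagement compounds quickly, which suggests how a fabrication could climb past 200,000 likes.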