But perhaps you fall into a third category. You’re intensely suspicious of the mainstream media, which you think is largely controlled by liberals, or corporations, or unseen billionaires. Maybe you think the media has twisted the “fake news” label to apply it to stories they just don’t like, or want to discredit.
For people in this last category, Facebook’s warning may be infuriating. They may think this fact-checking is a way to stifle some of their favorite sites, and they might even see it as an attack on good old American free speech. If some people see the warning in this light, it loses power. Some may tell their friends to ignore the warning and read the story anyway. (If they want to post the same story on their own Facebook page, though, they’ll have to click through another alert that reads: “Before you share this story, you might want to know that independent fact-checkers disputed its accuracy.”)
It’s not totally clear how many internet users fall into each of these categories, but a report from Pew Research Center released Thursday offers some hints. According to Pew, 64 percent of Americans think fake news creates “a great deal of confusion” around basic facts. A further 24 percent think it creates “some confusion,” and 11 percent say it creates “not much” or no confusion at all.
That 11 percent of people might bristle to see Facebook labeling stories as fake, if they don’t believe fake news is a problem at all. (But since Pew didn’t tightly define what “fake news” is, some people who think that reputable sources like The New York Times peddle fake news may be counted among those saying fake news causes confusion.)
There’s a danger that people who are disinclined to trust traditional sources of information will treat Facebook’s warnings as a badge of honor. If fact-checking organizations deem a story questionable, they might be more likely to read and share it, rather than less. There’s reason to believe this group might think of itself as a counterculture, and take the position that anything that “the man” rejects must have a grain of subversive truth to it.
But Facebook’s new features will make it difficult for fake news to travel far. People will be able to easily and anonymously report a link as fake, flagging it for review by fact-checkers. Stories marked as “disputed” will be less likely to show up in others’ news feeds, and can’t be promoted or made into an ad. The company is even experimenting with monitoring how users interact with a story to determine whether or not the story is legitimate: If people who read a certain story are much less likely to share it, Facebook says, that might be a clue that the story is misleading.
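To make that last signal concrete, here is a minimal sketch of how a read-versus-share heuristic could work. This is purely illustrative, not Facebook’s actual system: the baseline share rate, the threshold, and the function names are all assumptions invented for this example.

```python
# Illustrative heuristic (NOT Facebook's real implementation): flag a story
# when the fraction of readers who share it falls well below a typical
# baseline, which the article suggests may hint the story is misleading.

def share_rate(reads: int, shares: int) -> float:
    """Fraction of readers who went on to share the story."""
    return shares / reads if reads else 0.0

def looks_misleading(reads: int, shares: int,
                     baseline_rate: float = 0.05,   # assumed typical share rate
                     threshold: float = 0.5) -> bool:
    """Flag stories shared at less than half the assumed baseline rate."""
    return share_rate(reads, shares) < baseline_rate * threshold

# Example: 10,000 reads but only 100 shares (1% vs. a 5% baseline)
print(looks_misleading(10_000, 100))  # prints True
```

In practice, any real version would need far more care: small read counts make the ratio noisy, and legitimate niche stories are also shared rarely, so a signal like this could only ever be one clue among many.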
Apart from the plan to monitor sharing habits, all the new features Facebook announced Thursday are driven by humans. The company has had a fraught relationship with using human editors to moderate content on its site. In August, after being accused of suppressing conservative news, it fired all the editors who were in charge of the “trending” module found in the top-right corner of the site. But the inability of machines to match human judgment became clear almost immediately, when the top trending story the following week was factually inaccurate.