
Mark Zuckerberg announced Friday that Facebook will begin surveying users about which news sources they trust, in an effort to rank publications on “trustworthiness.” This rating will help determine media companies’ placement in the News Feed, thereby materially changing the traffic that their stories receive.

Zuckerberg provided only a cursory description of the survey methodology. “As part of our ongoing quality surveys, we will now ask people whether they’re familiar with a news source and, if so, whether they trust that source,” he wrote in a Facebook post. “The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don’t follow them directly.”
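Zuckerberg’s post gives no formula, but a two-question survey of this shape implies one natural scoring scheme. Purely as an illustrative sketch (the function name, data shape, and weighting choice below are our assumptions, not Facebook’s), a “broad trust” score could combine how widely a source is known with how trusted it is among those who know it:

```python
from dataclasses import dataclass

@dataclass
class Response:
    """One respondent's answers about one news source."""
    familiar: bool  # "Are you familiar with this source?"
    trusts: bool    # asked only if familiar; False otherwise

def broad_trust_score(responses: list[Response]) -> float:
    """Hypothetical score: combine reach (share of all respondents
    familiar with the source) with trust among the familiar, so a
    source trusted only by its own small readership scores below
    one that is widely known and widely trusted."""
    if not responses:
        return 0.0
    familiar = [r for r in responses if r.familiar]
    if not familiar:
        return 0.0
    familiarity_rate = len(familiar) / len(responses)
    trust_rate = sum(r.trusts for r in familiar) / len(familiar)
    return familiarity_rate * trust_rate
```

Under this toy formula, a niche outlet familiar to 5 percent of respondents and trusted by 90 percent of them scores 0.045, while an outlet familiar to 80 percent and trusted by 60 percent scores 0.48, tracking Zuckerberg’s distinction between sources “only trusted by their readers” and those “broadly trusted across society.”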

Given that the results of these surveys will help determine which news brands prosper, this is a very limited explanation of how the system will work. One thing we know for sure: Facebook has decided that, at an institutional level, it will not create an editorial process for rating these publications. “The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division,” Zuckerberg wrote. “We could try to make that decision ourselves, but that’s not something we’re comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem.”

The approach raises many important questions about the implementation of the new policy. Here are five that jump out:

  1. Will user surveys be the sole determinant of a publication’s authoritativeness?

From the description of the survey, it seems as if this approach may be able to sniff out the magnitude of a publication’s ideological commitments. But publications with similar ideologies can have very different editorial standards and resources. National Review and Mother Jones might be partisan publications, but both maintain a level of rigor that should factor into their authoritativeness in any reasonable system. Will it?

  2. Who has designed the survey? Were outside experts brought into the fold? Are they relying on existing, similar surveys?

Facebook may have a bevy of experts working on this; it may have dozens of sociologists advising it on how to design the best possible survey. But we don’t yet know who will be responsible. This should be a public matter, given the survey’s import to the public sphere.

In particular, there are two surveys that might show us what the results could look like, one from the Reynolds Journalism Institute in 2017 and the other from the Pew Research Center in 2014. Here’s what the Reynolds “trust” rankings looked like:

[Chart: news outlets ranked by trust, via the Reynolds Journalism Institute]

If the rankings Facebook devises look even remotely similar, and if they affect every part of the “trust” distribution, then some very prominent companies could suffer, most obviously BuzzFeed and Breitbart.

It’s also unclear if companies like Vice, BuzzFeed, and The Wall Street Journal, all of which create news and other types of content, would find all of their outputs affected by the rankings, or merely the news. If the latter, then Facebook will need some means of discriminating between different content types or requiring companies to label what they’re publishing.

  3. Will there be a human review process? Who will be a part of it?

It seems unlikely that this system can be fully automated. There must be some humans at Facebook in the loop who at least spot-check the data. Does Facebook already have staff who could do this? What if they find problems with the rankings?

In particular, what kind of steps has Facebook taken to ensure that this kind of survey can’t be manipulated en masse by some subset of users to tank the rating of news outlets that they don’t like? Surely—hopefully—they have considered this possibility.
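Zuckerberg’s post says nothing about safeguards. Purely as an illustration of the sort of spot-check a human reviewer might run (the function, inputs, and threshold below are all invented for the example), a crude flag for coordinated down-rating could compare a source’s recent ratings against its long-run baseline:

```python
from statistics import mean

def looks_brigaded(baseline: list[float], recent: list[float],
                   max_drop: float = 0.25) -> bool:
    """Hypothetical spot-check: flag a source whose average trust
    rating (on a 0-1 scale) has fallen sharply in a recent window
    relative to its long-run baseline. A sudden, large drop is
    consistent with, though not proof of, a coordinated campaign."""
    if not baseline or not recent:
        return False  # not enough data to judge either way
    return mean(baseline) - mean(recent) > max_drop
```

A reviewer would then inspect flagged sources by hand. A real defense would also have to weigh account age, response velocity, and coordination signals, none of which a simple mean-shift test captures.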

  4. Will media organizations be able to know and contest their ratings?

Media organizations that are heavily reliant on Facebook traffic will want to stay in the good graces of the system’s rankings. But it’s possible that the rankings will not be public, or even known to the companies themselves. Based on Facebook’s past practices, it does not seem likely that the company will create a process that allows media organizations to contest their rankings, but an appeals process would make sense.

  5. Will Facebook link its guesses about users’ ideologies to their ratings of news sources?

Facebook alone possesses an intriguing ability: It could cross-reference what it already knows about users with the ways they answer the surveys. Perhaps that knowledge could be used to correct for partisan bias in the responses.

For example, if it found that conservative users rate liberal publications substantially less trustworthy than comparably liberal users rate conservative publications, it might be able to calibrate the survey results accordingly. That said, this would raise novel questions of its own.
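Facebook hasn’t said whether it would attempt any such adjustment. As a rough sketch of the idea (the inferred-ideology labels and target mix below are hypothetical), one could reweight ratings so each group counts in proportion to a target population mix, much as pollsters post-stratify their samples:

```python
from collections import defaultdict

def calibrated_trust(ratings: list[tuple[str, float]],
                     target_mix: dict[str, float]) -> float:
    """Hypothetical calibration: reweight trust ratings so each
    inferred-ideology group counts in proportion to target_mix
    (e.g., its share of the population), muting the effect of one
    group rating out-group outlets especially harshly.

    ratings:    (inferred_ideology, trust_rating) pairs, rating in [0, 1]
    target_mix: desired weight per ideology label
    """
    by_group = defaultdict(list)
    for ideology, rating in ratings:
        by_group[ideology].append(rating)

    score, total_weight = 0.0, 0.0
    for ideology, weight in target_mix.items():
        group = by_group.get(ideology)
        if group:  # skip groups with no respondents, renormalize below
            score += weight * (sum(group) / len(group))
            total_weight += weight
    return score / total_weight if total_weight else 0.0
```

For instance, a panel of two liberal raters (0.9 and 0.8) and one conservative rater (0.2), calibrated to a 50/50 target mix, yields 0.525 rather than the raw average of 0.63, damping the panel’s liberal skew.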
