Facebook has been weathering a series of disapproving news cycles after clarifying that its disinformation policies exempt political ads from review for truthfulness. There are now reports that the company is considering reducing the targeting options available to political advertisers.
No matter how Facebook and its counterparts tweak their policies, their choices will prompt broad anxiety and disapprobation among experts and their own users. That’s because two fundamental problems underlie the debate. First, we the public don’t agree on what we want. And second, we don’t trust anyone to give it to us. At one moment someone can reasonably question how Facebook thinks itself king, deciding what each of its 2.4 billion users can see or post; at another, ask how Facebook can make a profit of $22 billion a year from distributing others’ content yet take no consistent responsibility for what’s inside it.
What we need are ways for decisions about content to be made, as they inevitably must be when platforms rank and recommend content for us to see; yet for those decisions not to be too far-reaching or stiflingly consistent, so there is play in the joints; and for the deep stakes of those decisions to be matched by the gravity and reflectiveness of the process used to make them.

Facebook recently announced plans for an “independent oversight board,” a tribunal that would render the company’s final judgment on whether a disputed posting should be taken down. But far more than its own version of the Supreme Court, Facebook needs a way to tap into the everyday common sense of regular people. Even Facebook does not trust Facebook to decide unilaterally which ads are false or misleading. So if the ads are to be weighed at all, someone else has to render judgment.