Vowing to select members from “the widest possible set of diverse candidates from outside our normal channels,” Facebook itself will appoint the initial members by the end of the year; those members will then help select the rest until the board reaches its full capacity of 40 people. The hope is that by appointing members for fixed terms, with protection against removal for any reason except breach of a yet-to-be-announced code of conduct, the board will make decisions without regard to public opinion or Facebook’s business interests. Importantly, it will also be seen to be doing so.
The board’s decisions might not be any better in substance than the ones Facebook has been making already. Often, there is no “right” answer. And to hope that this body will finally resolve intractable disputes about the proper limits of free speech that have bedeviled lawyers and philosophers for centuries, well before all the added challenges of the digital age, is to misunderstand the board’s purpose.
What the board can do is explain the reasoning behind any particular decision or rule. This process of transparent, public reasoning is the main way people in a pluralistic community can come to view the rules that they have to abide by as legitimate. Research shows that people’s feelings about whether a decision is legitimate depend more on the process for reaching the decision—and, crucially, whether the decision is explained by reference to neutral, generally applicable rules—than on whether they agree with the choice.
This emphasis on explanation is seen throughout the charter. In nine pages, the charter repeatedly refers to the board’s responsibility to make decisions that are “explained clearly,” “using clearly articulated reasoning,” and written in “plain language.” The charter also instructs the board to give substantial weight to its own prior decisions—as common-law legal systems do—in deciding any case. This all amounts to the idea that the board’s decisions should be based on something more than mere gut feeling or personal opinion. That’s why it was a little simplistic for Zuckerberg to write in his letter yesterday that “just as our Board of Directors keeps Facebook accountable to our shareholders, we believe the Oversight Board can do the same for our community.” While the board should have regard to the community’s interest, it should not merely be a proxy for public sentiment—otherwise Facebook could just decide these issues by poll. Decisions need to be based on something more fundamental.
This is the role of Facebook’s “values.” So while the new board and its charter are getting all the attention, the values that Facebook published last week are just as consequential. These values, like the charter itself, emphasize Facebook’s commitment to “voice,” but note that this needs to be balanced against the need to respect authenticity, safety, privacy, and dignity. Importantly, these documents expressly incorporate international human-rights norms as informing the board’s decision making. This grounding in a more widely endorsed set of principles is another bid for broader acceptance of the new rules.