Facebook's desire for efficiency means democracy is out and technocratic, developer-king rule is in.
Let's stipulate that Facebook is not a country, that real governments fulfill many more functions, and that people are not citizens of their social networks.
Nonetheless, 900 million human beings do something like live in the blue-and-white virtual space of the world's largest structured web of people. And those people get into disputes that they expect to be adjudicated. They have this expectation in part because Facebook has long said it wants to create a safe environment for connecting with other people. (How else can you get people to be "more open and connected"?) But people also want someone to be in charge: an authority to whom they can appeal if some other person is being a jerk.
Except in this case, the someone really is a corporate person. So when you report something or someone reports something of yours, it is Facebook that makes the decision about what's been posted, even if we know that somewhere down the line, some human being has to embody the corporate we, if only for long enough to click a button.
Any individual decision made by Facebook's team -- like taking down this photo of a gay couple kissing -- is easy to question. Ars Technica's Ken Fisher detailed a whole bunch of one-off problems that people have encountered with Facebook's reporting system. In each, there is an aggrieved party, but we're only hearing one side of the conflict when these problems bubble up. Across many single events, you have two people (or entities like businesses) with conflicting desires. This is a classic case where you need some sort of government.
It's not hard to imagine making one or 20 or even 200 decisions about photographs or status updates in a week, but it's mind-boggling to consider that Facebook has to process 2 million reports per week, and that's not counting simple "mark as spam" messages.
How do you design a system to deal with that workload? I spoke with James Mitchell, who helms what Facebook calls "site integrity" within its user-operations department, and Jud Hoffman, the company's global policy manager, about the reporting process. They are the architects of Facebook's technocracy.
"The amount of thought and debate that goes into the process of creating and managing these rules is not that different from a legislative and judicial process all rolled up into one," Hoffman, a lawyer, told me. "And James has the executive/judicial element. I don't think it is a stretch to think about this in a governance context, but it's a different form and we take it really, really seriously."
The key step, Mitchell told me, was to put some structure into the reporting process. Back when he started in 2006, complaints from users came in without any set form. That meant there was a massive queue of undifferentiated problems. So, he and his team started to think about what kinds of problems they received and created categories of problems, which they refined over time.
That allows the reports to be channeled through a complex set of processes and teams so that they arrive in front of human beings or computers that know what to do with them.
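The triage Mitchell describes can be sketched roughly as a routing table: a structured report carries a category, and the category determines which queue (human team or automated system) it lands in. The category names and queues below are invented for illustration; Facebook has not published its actual taxonomy.

```python
# Hypothetical sketch of category-based report routing. The categories,
# queue names, and fallback behavior here are illustrative assumptions,
# not Facebook's real system.

ROUTING = {
    "nudity": "content-review",        # binary calls, often outsourced
    "hate_speech": "policy-review",    # needs human judgment
    "spam": "automated-classifier",    # handled by algorithms
    "harassment": "safety-team",
}

def route_report(report: dict) -> str:
    """Return the queue a structured report should be channeled into.

    Uncategorized reports fall back to a general queue -- the
    undifferentiated pile Mitchell's team set out to eliminate.
    """
    return ROUTING.get(report.get("category"), "general-queue")

print(route_report({"category": "spam"}))      # automated-classifier
print(route_report({"category": "unknown"}))   # general-queue
```

The point of the structure is visible even in this toy version: once a report is categorized at submission time, no human has to read it just to figure out who should read it.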
Facebook has revealed this infrastructure for the first time today. It's the product of more than five years of work by several teams within Facebook, who have worked to make the process of handling this flood of user inquiries as efficient as possible. (Click the graphic to enlarge it.)
At the end of many of these reporting lines, there's a person who has to make a decision about the user's message. Some of these decisions are binary -- Does this photograph contain nudity? -- and those are generally outsourced to teams that can apply simple and rigorous formulas such as asking, "Is this person naked?" Other decisions involve weighing so many factors at once that machines handle them better than people can. (For example, there are more than 50 signals that Facebook's algorithms look at to determine whether a profile is spam, and the automated responses are more accurate than human ones would be.)
But the bulk of the reports are fielded by a faceless team of several hundred Facebook employees in Mountain View, Austin, Dublin, and Hyderabad. These people and the tools they've built have become the de facto legislators, bureaucrats, police, and judges of the quasi-nation of Facebook. Some decisions they make impact hundreds of millions of people in some small way; other decisions will change some small number of people's lives in a big way.
What's fascinating to me is that Facebook has essentially recreated a government bureaucracy complete with regulators and law enforcement, but optimized for totally different values than traditional governments. Instead of a constitution, Facebook has the dual missions of making "the world more open and connected" and keeping users on its site by minimizing their negative experiences. Above all, Facebook's solutions to governance problems have to be designed for extreme efficiency at scale.
As stipulated above, real-world governments have to fulfill all kinds of functions aside from adjudicating disputes between citizens, but just look at the difference in scale between Facebook's government and Palo Alto's. Palo Alto has roughly 65,000 residents and 617 full-time employees. Facebook has 900 million "residents" and a few hundred bureaucrats who make all the content decisions.
The original technocrats were a group of thinkers and engineers in the 1930s who revived Plato's dream of the philosopher-king, but with a machine-age spin. Led by Thorstein Veblen, Howard Scott, and M. King Hubbert, they advocated rule not by the people or the monarch or the dictator, but by the engineers. The engineers and scientists would rule rationally and impartially. They would create a Technocracy that functioned like clockwork and ensured that the productivity of all was efficiently distributed. They worked out a whole system by which the North American continent would be ruled with functional sequences that would allow the Continental Director to get things done.
Technocracy, as originally conceived, was explicitly not democratic. Its proponents did not want popular rule; they wanted rule by a knowledgeable elite who would make good decisions. And maybe they would have, but there was one big problem. Few people found the general vision of surrendering their political power to engineers all that appealing.
With Facebook, people seem to care much more about individual decisions that Facebook makes than about the existence of the ultraefficient technocratic system. They are not challenging the principles or values of the system so much as wanting them to be applied quickly to resolve their particular dispute. And that desire for speed, of course, drives the efficiency-first mindset that makes it hard to deal with nuanced problems. None of the accusations leveled at Facebook's administrative system read to me like criticisms of its core structure.
I mean, of course Facebook's governance isn't perfect. Of course the people who run it make mistakes, mistakes they work to squeeze out of the system with every bit of data at their disposal. These problems are a consequence of running our social lives through a centralized, corporate social network with a set of rather staid goals: openness, connectedness, and the minimization of negative experiences. Given these goals, Facebook has come to a rational set of structures for dealing with social problems within its walled garden. It is a gated community with some CC&Rs, and if you don't like it ... Well, there's always Brooklyn!
That is to say, the real question is whether Facebook's goals -- and the systems it uses to promote them -- reflect one's own desires. Do you want a clean, well-lighted place that works without any effort on your part? If so, Facebook has the governance structure for you. You want a more permissive place with fewer rules? Allow me to introduce you to 4chan.