Was Facebook responsible for the election of Donald Trump in 2016? Trump’s campaign says yes. Most of his opposition says yes. And now a ranking executive at Facebook, Andrew Bosworth, says yes. “I think the answer is yes, but not for the reasons anyone thinks,” Bosworth wrote in an internal “Thoughts for 2020” post that leaked yesterday, and that he subsequently posted in full. “He didn’t get elected because of Russia or misinformation or Cambridge Analytica.”
So what did get him elected then?
First, you have to know who is talking here. Bosworth, known inside and outside Facebook as Boz, is the company’s id. As one of Mark Zuckerberg’s computer-science teachers and a very early employee, Bosworth has unusual latitude to say the quiet parts of Facebook’s self-conception out loud. He does this in posts that Facebook employees can read—and although they are far from official announcements, this isn’t just some guy talking or even a hot-mic moment. Boz is Facebook rootstock, and what he says reflects, at the very least, part of the conversation swirling around the company’s executives. (I reached out to Facebook, and the company had no further comment on the memo.)
The crux of Bosworth’s post is that Facebook, fundamentally, doesn’t need to change. Maybe some tweaks here and there (“get ahead of polarization and algorithmic transparency”), but the system as a whole is sound. In the Facebook mind-set, this seems to suggest that the social network does not meaningfully distort the “natural” preferences that people have. Whatever mess you see on its platform, it is the same mess that exists out there in the world, and Facebook is a fair playing field on which seekers of attention are rewarded roughly in accordance with their quality.
This is the context for the crucial line in Bosworth’s post: “[Trump] got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser.” In other words, Donald Trump earned the Electoral College victory. Facebook played a crucial role, but merely as a conduit for fair-and-square campaigning. Trump’s campaign ran orders of magnitude more ad variations than the Clinton campaign did, and that approach does seem to have been effective.
It’s easy to imagine the continuation of the argument: Would anyone blame the medium of television for John F. Kennedy’s 1960 victory just because Kennedy was so much better than Richard Nixon on TV? One might even ask the same thing about the direct-mail revolution—and its great kingpin, Karl Rove. Would you ban the mail just because some political operatives got good at using it to win elections?
Since the 2016 election, many different and sometimes conflicting critiques of Facebook have sprung up. Among the complaints: an openness to exploitation by bad-faith actors like Russian operatives, the propensity of the system to propel conspiracy theories and fake news, a lingering sense that the company has not fully accepted responsibility for the content that courses through its pipes, and a fractal irresponsibility epitomized by the company’s slow-motion responses to Facebook-inflected human-rights crises, such as that in Myanmar (also called Burma).
Meanwhile, observers such as the United Nations’ special rapporteur on human rights worry that, without a better grounding in principle, Facebook could become a back door for speech suppression in authoritarian regimes. And the far right accuses Facebook of being unfair to it, even as right-wing content thrives on Facebook in America and around the world.
Bosworth extended his logic to the rest of the platform too, arguing against “limiting the reach of publications who have earned their audience, as distasteful as their content may be to me.”
But as the journalist Joshua Benton pointed out, “earned” is doing a lot of work there. Facebook has worked closely with many media companies over the years, pushing and pulling them with different incentives. Low-end publications have grown by posting stolen content, hoaxes, conspiracy theories, and lies. And it is at least plausible that Facebook’s reward system encourages sensationalistic, schmaltzy, or truthy content. Monthly lists of the most-shared stories tend to show exactly that.
To generalize: Over the years, Facebook has not been good at anticipating the second-order consequences of its actions. It reshapes people’s behavior and companies’ investments, and then is surprised when that changes the system.
Bosworth is fully within the individualistic traditions of Silicon Valley, which tend to see people as society-free market particles interacting with one another. In his memo, he compares Facebook to food companies. “What I expect people will find is that the algorithms are primarily exposing the desires of humanity itself, for better or worse,” he wrote. “This is a Sugar, Salt, Fat problem. The book of that name tells a story ostensibly about food but in reality about the limited effectiveness of corporate paternalism. A while ago Kraft foods had a leader who tried to reduce the sugar they sold in the interest of consumer health. But customers wanted sugar. So instead he just ended up reducing Kraft market share. Health outcomes didn’t improve. That CEO lost his job. The new CEO introduced quadruple stuffed Oreos and the company returned to grace.”
But there are shelves of books that paint a much more complicated picture of why Americans eat so much sugar. There are historical reasons for our huge corn plantings, which made high-fructose corn syrup a major product. There are many decades of marketing and brand building, which presumably have had some effect on what people eat. There are the deep economic inequities that create lucrative markets out of poor people. Americans today eat 700 more calories a day than Americans of the early 1960s. If humans are just humans, how could they change so quickly?
All of this has contributed to the massive increase in the cost of health care. There’s some personal responsibility to be sought there, sure. But when life-expectancy gaps between zip codes in a city can be 10, 20, or even 30 years, maybe it’s a better idea to look at the policy reasons and corporate imperatives that generate these problems. At the same time, as Michael Pollan has been pointing out forever, climate change and cheap industrial food are linked through the fossil fuels used (or released) in their production. It’s possible to say that people should just change their eating habits or die happily eating bacon and whipped cream every day—but that won’t do a thing to fix the collective problems in health care or climate. To do that, you have to look at the rest of the picture. History, structures, incentives.
The problem for Facebook, in that regard, is that it created the entire system. The history is its history. The logic of distribution on the site is its logic: predicated on virality, on making already popular things still more popular. It has made the decisions to quantify likes and shares. And it is constantly reshaping the topography of attention. It’s silly to say you can’t change the way a river flows when you are the watershed.
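That virality logic—popularity compounding on itself—isn’t just rhetoric; it’s a well-studied feedback loop. A toy “preferential attachment” simulation (a standard model from network science, not Facebook’s actual ranking code) shows how a system that rewards the already popular concentrates attention on a handful of items, compared with a system that treats every item equally:

```python
import random

random.seed(0)  # reproducible toy run

N_ITEMS, N_SHARES = 100, 10_000

def run(preferential):
    """Simulate N_SHARES share events across N_ITEMS posts.
    If preferential, a post's chance of being shared again is
    proportional to the shares it already has (rich get richer);
    otherwise every post is equally likely to be shared."""
    counts = [1] * N_ITEMS  # each post starts with one share
    for _ in range(N_SHARES):
        if preferential:
            i = random.choices(range(N_ITEMS), weights=counts)[0]
        else:
            i = random.randrange(N_ITEMS)
        counts[i] += 1
    return counts

viral = run(preferential=True)
flat = run(preferential=False)

# Under rich-get-richer dynamics a few posts hoard most of the
# attention; under uniform choice everything stays near the average.
print("most-shared post, viral feed:  ", max(viral))
print("most-shared post, uniform feed:", max(flat))
```

The inputs are identical in both runs; only the distribution rule differs. That is the sense in which the “mess” on the platform is not simply the mess of the world passed through neutrally.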
Bosworth acknowledges that Facebook should metabolize criticism, and in that vein, it does seem worth evaluating whether the hyper-partisan ecosystem of pages and people that grew up around the Trump campaign reflects the functioning of a healthy social-media platform. That’s not a call to ban Trump, but to examine the Facebook system’s role in his rise to power.
As for Kraft, the story continued after Bosworth’s anecdote ended. The company did recover, merging with Heinz in 2015 after a big international expansion. But Kraft Heinz shares have gone down more than 60 percent over the past few years, massively underperforming the S&P 500. One reason is that consumer demand is forcing the company to spend on new, healthier offerings. Generally, the whole food industry is in tumult: In the spring of 2019, one publication counted 16 major food and beverage companies that had recently changed chief executives.
Which suggests: Maybe you can’t grow forever by feeding people junk.