The Trump Decision Turned Content Moderation Into Shark Week

This is new, and this is bizarre.

[Image: A flickering gavel on a TV screen. Credit: Adam Maida / The Atlantic / Spiderstock / Getty]

This morning, the oversight board, a putatively independent body funded by Facebook through a $130 million trust, announced a decision in its tenth case: The removal of Donald Trump from the social platform had been justified, it said, but poorly executed. The meaning of his “indefinite suspension” wasn’t very clear, nor were any of Facebook’s existing policies about world leaders. The company has been given six months to decide what to do and get back to the board.

Okay, and no surprise. What might be surprising, if you haven’t been tuned in to content-moderation chatter for the past 10 years, is how much of an event this has become. Never before has a blog post about an account suspension been plausibly described as “anticipated.” The board’s announcement was preceded by a week of “what to expect” coverage in national outlets (CNN had live updates!), as well as months of feature-length reporting about the creation, motivation, limitations, and generally futuristic—or dystopian, depending on whom you ask—vibe of something nicknamed the “Supreme Court” of a social-media site. (The New York Times’ Ben Smith wrote in January that he pictures the board members—including prominent lawyers, heads of nonprofits, a Nobel laureate, and a former prime minister of Denmark—“wearing reflective suits and hovering via hologram around a glowing table.”)

The board itself teased the decision in this case, first promising on April 16 that its answer would be revealed at some point “in the coming weeks,” then saying that the ruling would drop on May 5, at “approximately” 9 a.m. On Monday, when the board finally released its final decision about when it would release its final decision about Trump, the Aspen Institute announced that registration was now open for a thrilling webinar, “Deplatforming Trump: The Facebook Oversight Board Decision,” to be held tomorrow with actual members of the board. This will be their “first LIVE public appearance.” When I asked Vivian Schiller, the executive director of Aspen Digital, the program that is hosting the webinar, how many attendees she was expecting, she wouldn’t guess, but joked that, given the importance of the moment, the event would be huge. “Think Super Bowl,” she said.

Trump decision week! It’s like Shark Week, but less scenic. Now that Facebook has been granted half a year to explain to the board whether it really meant to ban Trump forever, the event is ongoing. This is new and this is bizarre. There is plenty to consider in the oversight board’s list of recommendations for reform at Facebook—as there has been in several of its previous decisions. But we might also consider what it means that individual content-moderation decisions are now media spectacles. “Here we are in 2021 and the biggest judgment coming down the pike in the news cycle is one from Facebook’s made-up court,” remarked Sarah T. Roberts, an assistant information-studies professor at UCLA who researches content moderation, when we spoke yesterday. “That’s weird.”

In January, following his encouragement of the violent insurrection at the U.S. Capitol, Trump was deplatformed from essentially every major social-media site he had ever used. At the time, liberals rejoiced and prominent conservatives cried censorship, while free-speech experts expressed tepid, sometimes worried support. On the day of Joe Biden’s inauguration, Facebook’s vice president of global affairs, Nick Clegg, announced on the company blog that the matter of Trump’s indefinite suspension would be referred to the oversight board.

The board could decide whether Facebook’s decision was justified, Clegg wrote. It could also offer thoughts on how the suspension of world leaders should be handled in Facebook’s policies. “We believe our decision was necessary and right,” he wrote, but given the case’s “significance,” the company believed it was a good candidate for review by “the first body of its kind in the world: an expert-led independent organization with the power to impose binding decisions on a private social media company.”

Thought leaders of various stripes offered public comment to the board, including a handful of House Republicans who wrote a letter about censorship of “conservative viewpoints” but did not actually state whether they believed Trump should be allowed back on Facebook. And in February, the board said that Trump had submitted a statement on his own behalf. (That statement, excerpted in the board’s decision, is uncharacteristically dry. It asks for Trump’s account to be restored and claims that “all genuine Trump political supporters” who attended the Capitol riot “were law-abiding.”) The oversight board received 9,666 comments in all—a dramatic number, compared with its previous record of 35—and cited the need to read them all when pushing back its deadline for making a decision. Many of the comments have been published in an 8,599-page PDF.

Yet apart from some angry responses to the oversight board’s announcement in late January that it would take the case, I saw very little organic online conversation about the impending decision. That’s in part because many of the hubs where Trump supporters used to gather have also been removed from the internet. Notably, the MAGA world’s onetime home was shut down by its domain owner following the Capitol riot. Parler, the Twitter look-alike once known as the home of “Stop the Steal,” has been offline for maintenance for most of today. Trump launched a new website called From the Desk of Donald J. Trump yesterday, although it’s not a place for discussion. The site allows him to make social-media-like posts, but nobody can interact with them directly and Trump operates the only account. This morning, he was mostly posting about his nemesis Liz Cheney, though he did make one reference to the Facebook decision: “Free Speech has been taken away from the President of the United States.”

Still, some Facebook critics say it’s obvious that the whole point of referring this case to the oversight board was to create a media event. This one content-moderation decision should have been obvious all along—yes, Trump needed to be deplatformed—so any continuing debate serves only to distract from the company’s broader problems with hate speech and misinformation, argues Jessica González, a lawyer and co-CEO of the nonprofit Free Press. “They want this to be a sideshow,” she told me. “Meanwhile, conspiracy theories, lies, vaccine misinformation are all up on Facebook. We’re looking at the shiny object while they’re not doing their job.” (Facebook declined to comment on this claim.)

When I suggested that some of the spectacle comes from the fact that the case involves Trump—a known, nonstop spectacle—she countered that far less spectacle had developed around Twitter’s decision to ban him. It was a controversial choice when it was made, but now that conversation is over. González, for her part, is a member of the purely symbolic and self-declared “real” Facebook oversight board, which was formed in September as a stunt to highlight the absurdity of the official oversight board. A spokesperson for the “real” board expressed a sentiment similar to González’s: “This is a timed announcement aimed at amplifying drama.” (The oversight board did not return a request for comment.)

The “real” board has been playing into the decision drama quite a bit too, just as I am now. (The Atlantic has now posted three articles about the decision; more are on the way.) In recent days, it has fired off ever-spicier tweets; on Monday, the group announced on Twitter—with siren emoji—that it would respond to the oversight board’s decision within 90 minutes of its publication. As promised, the “real” board came out quickly with a response to the “completely toothless body,” criticizing it for tossing the big Trump decision back to Facebook and Mark Zuckerberg.

This decision is not a joke, of course; nor is Facebook’s power, even if this particular situation is approaching slapstick. Content moderation affects politics, culture, and people’s daily lives, Roberts reminded me. Early in her career, she said, when social media was still young, people thought that content-moderation problems could be resolved quickly. “The opposite is true,” she said. At this point, there’s a whole culture around debating them.

Soon after the board’s decision was released, news anchors went on air with expert guests, my inbox flooded with press releases and offers of commentary, and a popular internet-minded newsletter predicted “scattered takes, becoming heavy in the afternoon, but tapering to gags and rants by late evening.” Former White House Chief of Staff Mark Meadows called it a “sad day for America,” but I think plenty of people would say it was at least a little exciting.