How to Start a Conspiracy Theory on Facebook

On the network, new research suggests, false information can be as attention-grabbing as facts.

In a 2013 report summarizing global challenges, the World Economic Forum singled out "massive digital misinformation" as "one of the main risks for the modern society." Social networks may be structurally optimized for sharing; their structures, however, don't tend to distinguish between good information and bad. Which means that sites like Facebook aren't just great repositories for updates from your friends and pictures of your dog; they can also be breeding grounds for rumors, lies, and conspiracy theories. "False information," write Walter Quattrociocchi and a group of colleagues at Northeastern University, "is particularly pervasive on social media, fostering sometimes a sort of collective credulity."

That's the assumption, anyway. And Quattrociocchi and his colleagues wanted to test it—with a focus on Facebook. On its platform, they wondered, are there meaningful differences in the ways we interact with information that is true ... and information that is false?

The most obvious benefit of research conducted on Facebook is that it gives you not just tons of data to work with, but tons of structured data. Quattrociocchi and his team took advantage of that as they studied the patterns that emerged among more than 2.3 million Facebook users in Italy. To get a baseline measure of those users' gullibility, they analyzed, via Likes and comments, how those people treated political information posted to the network during the country's 2013 elections. They drew on three main categories of sources: mainstream news organizations, alternative news organizations (sites that share "topics that are neglected by science and mainstream media"), and pages devoted to political commentary.

Once they had that, the researchers compared those results against the same users' reactions to false information—stuff that had been sent into their streams by satirical news sites or lie-spreading trolls. They then measured the timespan between a post's first comment and its last, a rough proxy for the collective attention paid to it. They wanted to see how long people engaged with information that was true ... and how long they engaged with information that was false.
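For readers curious what that attention metric amounts to in practice, here is a minimal sketch, not the researchers' actual code: a post's "lifetime" is simply the span between its first and last comment. All names and timestamps below are hypothetical.

```python
from datetime import datetime

def post_lifetime(comment_times):
    """Return a post's 'lifetime' in hours: the span between its first
    and last comment, a rough proxy for collective attention."""
    if len(comment_times) < 2:
        return 0.0  # one comment or none: no measurable span
    ordered = sorted(comment_times)
    return (ordered[-1] - ordered[0]).total_seconds() / 3600.0

# Hypothetical example: three comments spread across a day and a half
times = [datetime(2013, 2, 1, 9, 0),
         datetime(2013, 2, 1, 15, 30),
         datetime(2013, 2, 2, 21, 0)]
print(post_lifetime(times))  # 36.0
```

The same number can be computed for a true news story and a conspiracy post alike, which is what lets the researchers compare the two distributions directly.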

Their findings will be, perhaps, unsurprising to anyone familiar with the discursive particularities of Facebook. Basically: whether the content was true or false was largely irrelevant to how long people kept talking about it.

Here's what that looked like, charted: 

The researchers found similar curves when they traced engagement patterns around the content—as manifested through Likes and comments: 

Or, as the researchers summed it up:

Attention patterns are similar despite the different qualitative nature of the information, meaning that unsubstantiated claims (mainly conspiracy theories) reverberate for as long as other information.

What they didn't track was the substance of these conversations; there's a chance that the Facebook users in question were spending all that time debunking the conspiracy theories in question. (But, you know: a chance.) The more salient point, though, is that they were dedicating their attention to those theories at all. They were allowing their time and their minds to be directed by misinformation, rumors, and lies. Which may be good news for your Uncle Stan and his various thoughts about the Kennedy assassination ... but bad news for the rest of us. How, then, do you start a conspiracy theory on Facebook? Throw a crazy idea out there, and see what happens.


Megan Garber is a staff writer at The Atlantic. She was formerly an assistant editor at the Nieman Journalism Lab, where she wrote about innovations in the media.
