
For a lot of people, the news about Cambridge Analytica has packaged together all of their anxieties about technology and politics with a bow. In today’s issue, we’ll try to triage questions about the electioneering firm and its abuses. Abdallah Fayyad is here with a guide to the aspects of the story that are likely to leave a lasting impact, and those that—as far as we know now—won’t. Meanwhile, Karen Yuan talks to ethicists, and brings back an answer to a question on a lot of people’s minds: Should I delete Facebook? Read through, let us know what questions are still lingering, and we’ll come back to them in a future issue.

—Matt Peterson

P.S. It might strike some readers as odd that we’re writing about Facebook’s many problems while the Masthead discussion group is still hosted on that platform. If you’re one of those readers, you’re in luck. We’ll be expanding the number of options for discussion very soon. Stay tuned for an announcement.


A Brief Guide to the Cambridge Analytica Scandal

Much has been written about Cambridge Analytica since The New York Times reported that the political consulting firm exploited private data belonging to tens of millions of Facebook users. But as the story continues to develop, some aspects will prove more interesting than others. To make sense of it all, here’s a list of the bigger stories that will most likely shape the coverage of the scandal as it unfolds and the lesser stories that will probably be left behind.

The bigger stories:

  • A failure to hold Facebook accountable. The problems the Cambridge Analytica scandal revealed have been open secrets for a while now. Facebook users have been shocked to learn that Cambridge Analytica managed to acquire private data from over 50 million people by harvesting it through an otherwise innocuous-looking personality quiz. The information it collected, which included data on what users liked on Facebook, could, according to Cambridge Analytica, tell its clients important details about those users’ personalities. But The Guardian broke this aspect of the story as far back as 2015. It took the renewed focus from the current round of allegations to make regulators think about what they can do to protect people’s privacy and hold Facebook accountable for this data breach.

  • What the future holds for the social media giant. On Sunday, Mark Zuckerberg took out full-page ads in The Washington Post, The New York Times, and The Wall Street Journal, along with several British newspapers, to apologize for what he called a “breach of trust.” Since the Cambridge Analytica scandal broke, Facebook’s stock has dropped by 14 percent. It’s proving difficult, if not impossible, for the company to disentangle itself from the scandal, and from the 2016 election as a whole.

  • The Trump campaign’s ties to Cambridge Analytica. Special Counsel Robert Mueller has been investigating the relationship between the president and the political consulting firm. Republican donor Robert Mercer donated $15 million to Cambridge Analytica, and Steve Bannon served as its vice president. The Trump campaign has attempted to distance itself from the firm, saying it did not use the firm’s data. But the campaign might have other legal problems from its association with the firm. As Robinson Meyer pointed out, it would have been a violation of federal law if any of Cambridge Analytica’s employees who weren’t U.S. citizens or green card holders worked on the Trump campaign.

The lesser stories:

  • Cambridge Analytica’s influence on the outcome of the election. The company did extract enough data from Facebook to create “psychographic profiles” on tens of millions of Americans, which it claims allowed it to tailor political messages to influence voters in swing states. Still, it’s unlikely that the company played a central role in swinging the election in Donald Trump’s favor. While there is no question that the firm had some impact on voter behavior, in an election riddled with fake news and outside interference, Cambridge Analytica seems to have merely contributed to—but not determined—the outcome.

  • The effectiveness of Steve Bannon’s brand of “psychological warfare.” Since The Guardian reported that whistleblower Christopher Wylie created a tool to use big data and social media to hack the minds of American voters, people have been talking up Bannon’s supposed strategic genius. The so-called psychological warfare tool was supposed to allow campaigns to conduct “information operations”—a military tactic to disrupt adversaries’ decision making—against the American electorate. But the experience of many of Cambridge Analytica’s clients, including Senator Ted Cruz’s failed presidential campaign, shows that the firm’s strategy still requires quite a bit of luck as well.

  • Cambridge Analytica’s role in spreading fake news. One of the big stories last week was Channel 4’s hidden-camera footage of Cambridge Analytica executives discussing their tactics, which included bribery and spreading fake news. While the firm may have had a hand in sharing fake news, its impact was comparatively small. A recent study showed that the internet is generally vulnerable to the spread of fake news for reasons that Cambridge Analytica neither created nor was particularly adept at taking advantage of.  

—Abdallah Fayyad


Are You Morally Obligated to Delete Facebook?

Facebook’s mishandling of personal data has given rise to #DeleteFacebook, and now the conversation has turned to the broader role Facebook plays in our lives. For some, the tech giant might be doing more harm than good. This raises an interesting moral question: If we find the platform unethical, do we have a personal responsibility to delete it?

I talked to two ethicists who broke this question into two parts. The first is a look into how things are: What and how much harm is Facebook causing us? That prompts a look into how things should be: If we believe that using Facebook causes us enough harm, what should we do?

Quantifying Facebook’s harm

“We don’t yet have a full picture of how tech affects us,” said Molly Crockett, a professor of moral psychology at Yale. If an obvious cause-and-effect relationship were discovered between Facebook’s collection of our data and political polarization, for example, that would help prove that Facebook itself is, indeed, harmful. But that has been hard for researchers to determine, since very little social-science data about Facebook usage is available.

Any action should be informed by more public understanding of the platforms, their effects, and how they relate, said Irina Raicu, the director of the internet ethics program at Santa Clara University. “We need basic education before making calls for change,” she said. To her point, some people have called to migrate social media use from Facebook to Instagram, not knowing that the former owns the latter.

What we’ve learned from the Cambridge Analytica revelations may count as definitively harmful to some people. But that still doesn’t clear up the second question.

Should Facebook detractors delete their accounts?  

For some, deleting Facebook isn’t easy. “Facebook is not a luxury for some people,” said Raicu. Those who can’t afford long-distance trips or calls need Facebook to keep in touch with friends and family. Not everyone can delete their pages on a dare, like Elon Musk. For others, deleting Facebook may feel too easy, a way of avoiding the hard work of improving platforms like Facebook, and using them for social good.

“I see the importance of a symbolic gesture, of collective action that expresses a point,” Raicu said. As an example, she brought up a Facebook boycott, in which Facebook accounts would briefly go dark to encourage the company to more closely focus on current privacy concerns. A boycott would also be easier for people who can’t leave the platform. The irony, Raicu said, was that such an event would likely be organized on Facebook.

Facebook may dominate the internet, but its problems are shared by many tech companies. “A worry is that tech companies are geared entirely toward first-order desires,” said Crockett, referring to desires felt in the moment, like the desire to eat a bag of chips. Second-order desires, on the other hand, are larger: If your long-term goal is to lose weight, for example, you might desire not desiring those chips.

Incentivized to grab our attention, tech companies may be giving us what we want, but it’s a narrow definition of “want.” So deleting Facebook may not address the broader problem. “Facebook is only the poster child of a moment,” said Crockett.

—Karen Yuan


The Banality of Facebook

The Cambridge Analytica story resonates in part because it touches on the fundamental processes of democracy. But as the writer and app developer Ian Bogost explained in an essay for The Atlantic, what makes the story so important is that it wasn’t just shadowy international consultants who had access to your online life. Back in 2010 and 2011, Ian obtained much of the same data through a much more trivial process: by creating a game called Cow Clicker. “Cow Clicker is not an impressive work of software. After all, it was a game whose sole activity was clicking on cows,” he wrote. But all that cow-clicking gave him access to a breathtaking array of data about his users. Here’s Ian:

Cow Clicker’s example is so modest, it might not even seem like a problem. What does it matter if a simple diversion has your Facebook ID, education, and work affiliations? Especially since its solo creator (that’s me) was too dumb or too lazy to exploit that data toward pernicious ends. But even if I hadn’t thought about it at the time, I could have done so years later, long after the cows vanished, and once Cow Clicker players forgot that they’d ever installed my app.

This is also why Zuckerberg’s response to the present controversy feels so toothless. Facebook has vowed to audit companies that have collected, shared, or sold large volumes of data in violation of its policy, but the company cannot close the Pandora’s box it opened a decade ago, when it first allowed external apps to collect Facebook user data. That information is now in the hands of thousands, maybe millions of people.


Today’s Wrap Up

  • Question of the Day: What questions do you still have about the way that politics and the internet are colliding in this election cycle? Write back, and we’ll revisit them in a future issue.

  • What’s Coming: Wednesday, we learn from features editor Denise Wills how “My Family’s Slave,” our June cover story, came to be.

  • Your Feedback: We want to know your thoughts on The Masthead. Click below.
