
Let’s talk about Facebook. Strike that: We’re already all talking about Facebook. Pew reported last month that two-thirds of Americans are on Facebook—up modestly from 2016—and more than half of them use it to get news. Not everyone is happy about it. Many of you have written in to express concern that The Masthead uses Facebook for our private discussion group. Today, we’re kicking off a series taking that concern head-on, starting with the dark side. I’ll talk to Alexis Madrigal about his Atlantic reporting on Facebook’s place in American democracy, and I’ll relay my conversations with you all about Facebook. In coming issues, we’ll talk about what’s valuable in the world’s largest social network.

Before we get into it, Derek Thompson raised some of these same questions—and more—in his conference call with Caroline Kitchener yesterday. If you missed it, you can listen back and download the transcript here.

FACEBOOK’S POWER OVER DEMOCRACY

Last week, Alexis Madrigal reported on Facebook’s effects on American democracy. “The information systems that people use to process news have been rerouted through Facebook, and in the process, mostly broken and hidden from view,” he wrote. I wanted to know more, so I called him up. This is an edited transcript of our conversation.

Matt Peterson: Am I right to think that Facebook has gotten some of its power by accident? As you wrote, they became dominant in news consumption because they were trying to take out a competitor, Twitter. They also didn’t expect that governments like Russia’s would spread propaganda organically, as opposed to through ads. Did they see all this coming? Are they freaked out about this in the way that we are?

Alexis Madrigal: It was intentional that they got all this power. They were following corporate imperatives that make sense within a competitive capitalist economy. A lot of what we’re seeing are the externalities that result from that. The literature people developed to talk about environmental economics is useful here. It’s not as if coal power plants want to put out lots of mercury; it’s just a function of burning coal. For a long time, you could say the benefits of burning coal perhaps outweighed the downsides. And then global warming came along and there was a reweighting of the pros and cons of burning coal.

Are they as worried about it as we are? I don’t think they were until quite recently. The election, as in many places, was a turning point. There are two reasons. One, Trump won, and for most people in Silicon Valley, that was, if not shocking, not a good thing. They expected things to stay within a fairly stable political range. And Trump was seen as outside of that.

The second thing is just that the trigger moment of the election showed some of the weaknesses in what they would call “information integrity.” That is now considered to be an existential problem for the company. The solutions that Facebook decides to implement will have wide-reaching effects on future elections in this country and other places, and on just what Facebook feels like. If you look at the way Mark Zuckerberg has responded to 2017, it’s been a much broader rethinking of the philosophical underpinnings of Facebook. And that’s probably a good thing.

The thing that scares me the most about the coming period, where there’s this major re-evaluation happening, is some things might be easy to fix. There has to be a publicly available database of dark ads and whatever. The things that are hard, though, are the chaos elements. It’s like an island ecosystem you’re trying to manage, except a billion people’s minds are hooked up to it. If you introduce some predators to eat up the bad stuff, then maybe they eat up too much of the good stuff. If you try and replant one part of the island to change the environment to encourage the growth of beneficial plants, there may be unintended consequences.

That’s scary to me. I want them to make changes, I want them to think more deeply about all of those things. It doesn’t seem like it could get worse. But if the last couple of years have taught us anything, it can always get worse. It can always get weirder. It can always get more chaotic.

Matt: Tell me about the process of dealing with Facebook in writing a story like this. They didn’t initially comment on the record, but eventually sent a comment that you added as an update.

Alexis: Facebook rarely comments on the record in a substantive way. But in recent times, they are realizing they have to engage more. I think that’s good. It’s unclear what the result of that will be, and I imagine that there’s an internal battle within Facebook about how much to disclose, how much to talk about. But they’re talking more.

Matt: I’ve been thinking about this in relation to White House reporting. You can report on the White House, and the White House can do things to you, but not in the same direct way that Facebook could do things to you as a tech reporter. How do you think about that as you report on this stuff?

Alexis: I try never to think about it, really. To the extent that Facebook controls media distribution—which I think is to a very large extent—I’m sure they could just go in and zero me out somehow. I honestly don’t think they think about things in that way. We have never seen evidence that that’s ever happened, and we’ve written plenty of mean things about Facebook through the years. That’s not to rule it out in the future. The kind of power that someone you’re reporting on has over you really matters. But it’s always been, with this company, that you have to report from the outside in; it’s not an access-driven game.

And then on top of all that, right now, it’s actually hard to get inside technological development stories, which used to be kind of bread and butter for the internet. “How did this come to be? Who was the team who worked on it?” All that stuff is very significant, and I love reporting on it. People don’t want to read that right now. “I don’t care how this technology was developed, it’s ruining everything!”

As our members are reading coverage—not just ours, but more broadly—I do think it’s worth considering that the same kinds of optimization forces that we’re describing are also acting upon us. As people see negative coverage of Facebook increase, part of the reason is that readers want to read it. As reporters and as people who try to represent the truth to our readers as well as we can, we have to be conscious of the way those feedback loops act on us. That coverage feeds on itself; it’s a causal mechanism for driving things. We want to represent reality. We’re not trying to write things about Facebook just to be jerks. I’m trying to maintain that level of self-awareness.

THE CASE AGAINST FACEBOOK

Since we launched, I’ve been in touch with many of you about your views on Facebook. Here are a few representative arguments against the platform.

For many, it’s simply a privacy issue. Being on a platform that requires you to use your real name, and that sells your private data to advertisers, just isn’t appealing. As Ed told me, “I am not anti-technology, but rather a private person who has so far eschewed Facebook.”

“Facebook, in many ways, stands for the opposite of what you’re trying to accomplish,” said Gerald, echoing the view of several members who see Facebook as a threat to journalism, and specifically to The Atlantic. “I’ll not go on Facebook, do not want to be ‘sold’ to anyone, nor have my data mined. I hope you’ll keep that in mind as you try to create a better journalism.”

Jack argued that “Facebook is just not robust enough for meaningful conversation.” Sue shared a similar thought, adding that she misses the Ta-Nehisi Coates–style blog comments section. She’s on Facebook, but it doesn’t work for her. “I don’t find the Facebook format particularly conducive to discussion on a longer piece.”

Several of you touched on the threat that Facebook’s monopoly—and the filter bubble it creates—poses to a healthy political culture. Rahul said, “Facebook appears to be a cruel confluence of instant gratification and algorithms, creating a rather dystopian ‘safe space,’ where, in true Aldous Huxley style, one’s mind can stay habitually unchallenged and thereby fail to grow (or worse, atrophy). This in itself might not pose a problem to society if there were additional sources of contrasting viewpoints (newspapers, TV, radio, etc.). However, a large portion of the country gets its news solely from this source.” (I should note that, per Pew, more than half of Facebook users get some of their news there, but not necessarily all of it.)

But Facebook isn’t just a computer program. “Algorithms reflect the creator and whatever’s brought to bear including ethics, biases, and unintended consequences,” said Ruth. And those decisions have led to an evolution over time, from what some saw as boring but benign, to something else. Initially, Dave told me, “it seemed the grandest time waster that humans had ever made, but people can choose to grandly waste their time if they wish.” Now, he says, the “platform has far too much power, and the people running it are either unwilling or unable to control it.”

TODAY’S WRAP-UP

Question of the day: We’ve focused on negative stories, but there are many positive ones, too. What are yours?

Your feedback: We’re constantly thinking about how to make this membership better for you, even about how we do our surveys! Take a second and let us know how we’re doing.

What’s coming: Tomorrow, Caroline Kitchener writes about the Harvey Weinstein scandal and the role that human resources departments can play in sexual harassment and assault cases.

What we’re thinking about: Illustrating that there are two sides to every story, The Masthead’s Facebook group has hosted a lively conversation about gun control. Later this week, we’ll kick off a debate with you about it.

Matt Peterson

EDITOR, MASTHEAD
