Staff writer James Hamblin has been getting a lot of strange replies on social media. He recently wrote about hydroxychloroquine, the drug President Donald Trump has touted as a treatment for COVID-19, and it seems the Twitter trolls have come for him.
After reporting on the “disinformation architecture” aimed at helping Trump get reelected, staff writer McKay Coppins wrote about how that set of tools has been put to use during the pandemic. He joins Hamblin and executive producer Katherine Wells on the podcast Social Distance to talk about modern propaganda and the history of disinformation.
What follows is an edited and condensed transcript of their conversation.
Katherine Wells: Okay, here’s the deal, McKay. Jim and I talked last week about hydroxychloroquine. He’s been tweeting about it, and I was checking out his Twitter responses, many of which were angry at him for offering skepticism about the medication. I’m fine to believe that there’s a group of people who are just really irritated by Jim on Twitter. But I wanted to ask you if there could be something else going on.
McKay Coppins: What’s happening is there are at least some Twitter accounts that appear to have been designed just in the last couple of months exclusively to boost this drug. And then there are a lot of people advocating for it for political purposes, which is that they want to validate or vindicate President Trump. And then there’s probably also a group of people who genuinely believe in it. But a lot of the Twitter activity and social-media activity around this drug is not exactly what it seems.
Wells: Is this just a Twitter thing?
Coppins: Oh, definitely not. If you turn on Fox News, if you listen to conservative talk radio, if you spend time on certain Facebook groups, this conversation is taking place everywhere. And there are people driving this agenda in kind of a sophisticated way, honestly, and they seem to be convincing a lot of people who want to believe it.
James Hamblin: For me, the consistent theme I see is sowing chaos—just wanting people to appear divided along partisan lines, wanting people to appear to disagree for almost no reason at all. Why do you mobilize these bots around hydroxychloroquine? It’s a generic drug. No pharmaceutical company stands to massively profit off this. I can’t follow any motive other than: We want people to appear divided. We want a divided America along partisan lines. And here’s one way we can do it.
Coppins: Yeah, in politics, these are called wedge issues. Candidates seek out specific wedge issues to create that division and stake out their position on one side of the divide. And when the debate is about educational reform or taxes or whatever, there clearly are real ideological dividing lines. And it makes sense that you would want to create a wedge and force a debate or a conversation. But when you apply that political tactic to an unproven drug, it’s a whole different situation. You’re fanning the flames of division around something that is not ideological; it’s not political or partisan. It’s a matter of science, and we don’t have enough scientific data to prove what, you know, one side is arguing for.
And so instead it just becomes an online shouting match, a way to demonize the other side or prop up your own. Not to make this too philosophical, but I always think about Hannah Arendt, the political theorist who wrote about the big totalitarian regimes of the 20th century. And I’m paraphrasing here, but she wrote that the purpose of propaganda is not to instill conviction. It’s to destroy the capacity to form any. You’re just trying to make people cynical enough that they’re incapable of grasping a certain idea or reality. You just want people to be malleable and cynical.
Wells: If you just flood them with a certain message at the right time, you can get them to act in a certain way.
Coppins: Yeah, exactly. That’s what’s actually interesting about this particular playbook, because if you talk to scholars who study propaganda and disinformation, what they’ll say is that up until pretty recently, most autocratic regimes or even just kind of illiberal political leaders would try to censor dissenting voices and inconvenient information. They would shut down opposition newspapers and throw journalists and political dissidents in jail. That’s how they kind of maintained control and power.
What you’ve seen in the last 10 or 20 years is that a lot of the illiberal regimes around the world have realized that in this era of what’s called “information abundance,” where everybody has the internet, everyone has social media, everyone has TV and radio and books, it’s very hard to fully contain the spread of information. It’s much more effective to flood the zone with lots and lots and lots of content and propaganda and disinformation and noise. This is called censorship through noise. Basically, you’re drowning out the dissenting voices rather than throwing them in jail.
Wells: I remember one time I had a conversation with someone who grew up in China, and we were talking about the misinformation in Chinese media and state-controlled media and things like that. And I was like, “Oh, that seems so disorienting.” And I remember she said, “Well, in China, we just know not to trust it. But in the U.S., you still actually believe the things you hear.”
Coppins: Yeah. That’s such a good insight and an important point. I do think that that is a major problem in our society, and it’s born out of something good, which is that, compared to a lot of other parts of the world, we’re actually not used to our own government waging coordinated disinformation campaigns against us.
If you compare us as a people to, for example, people in Eastern Europe or the Baltic countries, who have spent generations dealing with Russian disinformation and Russian propaganda, you’ll find that they are a lot more savvy about it, and frankly a lot more cynical. We also have this fundamental belief, which I think is generally good, in free speech. We really believe that dissenting voices and opinions shouldn’t be censored. And we kind of instinctively push back against any effort to censor speech.
Wells: But that’s an ethic that comes from a time when the tool of control was censorship rather than flooding.
Coppins: Exactly. And you read like all the famous novels that are about future dystopia—they’re all very concerned with censorship, like the state coming in and burning books or sticking old newspaper articles down the memory hole. That idea colors so much of the literature about authoritarianism. But in this modern era, that’s really not how it works, at least not in most democratic or ostensibly democratic countries.
Wells: Does a functional democracy depend on a shared sense of reality?
Coppins: Yes, I think that the entire American experiment is premised to a certain extent on the marketplace of ideas and free debate and the free exchange of views. But I think the marketplace of ideas only works if objective reality serves as a regulating force. We all have to be starting from the same set of basic facts to have these big important ideological or political debates. And if we all have our own tailor-made set of facts, or “alternative facts,” then it effectively becomes impossible for us to reach consensus on any issue.