One professor says the ecosystem of online hate speech resembles a biomass pyramid, with apex predators, such as Alex Jones, at the top. (Edgar Su / Reuters)


In the early 1970s, the psychologist David G. Myers conducted a famous experiment on the power of groups. He divided several hundred undergraduates into two camps based on their attitudes toward feminism, creating a conservative cluster and a liberal one. Then he left them all alone to talk. When the groups disbanded, the liberal students had become much more liberal, and the conservative students had veered sharply right.

Today this effect is known as group polarization, and unlike many pseudo-phenomena in the field of psychology, it has been replicated in several additional studies. Spending long amounts of time with people who agree with you doesn’t just lead to groupthink, these researchers have found; it can also lead to the gradual silencing of dissent and the elevation of, and consensus around, the most virulent opinions. If you want to make people more extreme, you don’t have to threaten them or brainwash them. Just plop them in a like-minded group, and human nature will do the rest.

What does this have to do with the internet? Approximately everything. Social networks such as Facebook and Twitter are many things at once—a modern railroad crossed with a modern telephone network, mixed with a modern phone book, on top of a modern Borgesian library. Above all, social media are a mechanism for allowing people to find like-minded individuals and to form groups with them. To a technologist such as Mark Zuckerberg, this characteristic seemed to promise a new age of transnational peace and moderation (not to mention profit). But to a social psychologist, it sounded like a machine for injecting public discourse with ideological steroids.

The social psychologists were right.

The latest episode of Crazy/Genius, produced by Patricia Yacob and Jesse Brenneman, analyzes the recent wave of internet-inspired violence—from Charlottesville to Christchurch—and asks why the web became such a fecund landscape for extremism. Hate is an ancient offline phenomenon. But something about the design of our social-media platforms—and perhaps something inherent to the internet itself—has amplified the worst angels of our nature. (Subscribe here.)

The psychological roots of online hatred have three levels. At the bottom, there is group polarization and the natural tendency of moderate people to become extremist versions of themselves when they interact with like-minded peers. At the next level, there is what you might call Viral Screaming Syndrome—the natural tendency of web content to veer toward high-arousal emotions, such as outrage and paranoia, to attract attention and promote social sharing. “Video is really expensive to make, and reported video is really, really expensive to make,” says the Atlantic staff writer Alexis Madrigal. “You know what’s not expensive to make? A bunch of random, paranoid opinions to cut through the noise.”

Finally, the largest social-media networks have built algorithms that exacerbate both group polarization and Viral Screaming Syndrome. For example, YouTube executives knew that extreme and misleading videos were racking up tens of millions of views, but they declined to intervene, because they were “focused on increasing viewing time and other measures of engagement,” according to a Bloomberg report in April.

“The No. 1 thing, though, that happens when you look into the failings of the various platforms, YouTube and Facebook specifically, is you see this kind of fractal irresponsibility,” Madrigal says. “They were launching in countries where they literally would have no idea what people were saying. And now they’re being asked to defend elections, they’re being asked to, like, understand deeply the social dynamics of every place in which they are.”

If you think the chief driver of online extremism is algorithmic, then your preferred solutions are likely to be algorithmic tweaks. That’s not enough. After all, the internet’s failures exist only because of the natural tendencies of group behavior. To fix social media’s problems, you have to address them at the level of the group.

For Whitney Phillips, an assistant professor of communication and rhetoric at Syracuse University, the ecosystem of online hate speech resembles a biomass pyramid—with apex predators, such as Alex Jones, at the top, and the rest of us playing the role of worms and fungi at the bottom. “The reason that Alex Jones and other abusers and bigots have that platform is because other people engage with them,” Phillips says. “These messages are able to spread way further and way more quickly than they ever would have been able to do on their own because we share them.” The virus of hateful extremism grows because so many people share it—even if they’re just trying to point out how terrible it is.

This might seem like a catch-22: Ignore the bigots and become a bystander, or criticize them publicly and become their amplifier and accomplice. “The issue is generations old,” Phillips says. “If you go back and look at the first-wave Klan in the 1860s and ’70s, they used the news media and boasted to reporters about how many more followers they would get because of their story.”

Fighting extremism requires a pyramid-shaped strategy. At the top, it requires that social-media companies take stronger steps to ban the predators and monitor hate speech, as they’ve already begun to do. But the rest of us, the understory of the biomass pyramid, can do better too. If we understand the psychological origins of online extremism, we can play an equally important role in limiting the spread of the most hateful ideas.

Today’s internet users have the broadcast power once reserved for journalists, but journalistic power comes with journalistic responsibilities. Phillips says internet users ought to act more like a responsible press. One way is for users to ask themselves a simple question before pressing the share button: “What does the sharing of this content achieve? Does it merely make my social network a more extreme, chaotic, and frothily outraged claque, or does it add context and understanding to the world?” Some extremism deserves public condemnation. But sometimes it’s best to resist offering hateful speech the oxygen of amplification.
