
In the 1970s, the business professor Rosabeth Kanter published an influential account of an American company that had recently recruited women to its sales team. The quality of those women’s working lives, Kanter noted astutely, depended on their representation. When they made up just 15 percent of the workforce, they faced stereotyping, harassment, isolation, disproportionate performance pressures, and other disadvantages. But when they made up something like 35 percent of the workforce, they started shifting its culture in their favor by forming alliances and establishing a counterculture.

Decades of work in sociology, physics, and other disciplines have supported this idea. Small groups of people can indeed flip firmly established social conventions, as long as they reach a certain critical mass. When that happens, what was once acceptable can quickly become unacceptable, and vice versa. Two decades ago, most Americans opposed gay marriage, bans on public smoking, and the legalization of marijuana; now, these issues all enjoy majority support.

How big do minority groups have to get in order to trigger these tipping points? Is it something like 30 to 40 percent, as Kanter and others have suggested based on sociological observations? Or is it as low as 10 percent, as physicists have predicted using mathematical models that simulate social change?

After running a creative experiment, Damon Centola from the University of Pennsylvania says that the crucial threshold is more like 25 percent. That’s the likely tipping point at which minority views can overturn majority ones. “A lot of models have been developed, but they’re often people speculating in the dark, and writing equations without any data,” Centola says. “Our results fit better with the ethnographic data. It’s really exciting to me how clearly they resonate with Kanter’s work.”

Centola’s team recruited 194 volunteers, divided them into 10 groups, and made them play an online game in which they had to work together to create new social norms. In every round, the volunteers within each group were randomly paired up and shown a photo of a stranger. Without consulting each other, each person suggested a name that best matched the stranger’s face. At the end of every round, both names were revealed. The players earned 10 cents if they had offered the same name, and they lost 10 cents if they had entered different ones. Even though the players only ever interacted with one person at a time, as the game progressed, they quickly arrived at group-wide conventions, where everyone assigned the same name to each face.

At that point, Centola seeded each group with “activists.” These rabble-rousers consistently pushed an alternative name for each face, in an attempt to overturn the established order. And Centola varied the number of activists from one group to the next.

He found that these newcomers were effective in changing minds only if they made up at least 25 percent of the total population. Anything less than that, and their suggestions never took off. Anything more than that, and their alternatives completely replaced the previous status quo. There was nothing in between.

This result matched the predictions from a mathematical model that Centola’s team created to simulate these kinds of interactions. “You see this clump of failures below 25 percent and this clump of successes above 25 percent,” Centola says. “Mathematically, we predicted that, but seeing it in a real population was phenomenal.”
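The article doesn’t reproduce Centola’s model, but the basic dynamic of a committed minority flipping a convention can be sketched with a classic “naming game” simulation. This is an illustrative toy, not the paper’s actual model: here, flexible agents remember candidate names and collapse to one on agreement, while committed agents always push the alternative and never budge. (In this stripped-down version the tipping point lands at a different threshold than the 25 percent found with Centola’s memory-based model, but the all-or-nothing flip is the same.)

```python
import random

def run_naming_game(n_agents=50, committed_frac=0.25,
                    rounds=30000, seed=0):
    """Toy naming-game simulation with a committed minority.

    Flexible agents keep a memory (inventory) of candidate names;
    committed agents always propose ALT and never update. Returns
    the fraction of flexible agents who end up holding only the
    minority's name.
    """
    rng = random.Random(seed)
    NORM, ALT = "norm", "alt"
    n_committed = int(n_agents * committed_frac)
    # Committed agents are represented by None; everyone else
    # starts out holding the established convention.
    inventories = ([None] * n_committed +
                   [{NORM} for _ in range(n_agents - n_committed)])

    def propose(i):
        if inventories[i] is None:        # committed: always the new name
            return ALT
        return rng.choice(sorted(inventories[i]))

    for _ in range(rounds):
        speaker, listener = rng.sample(range(n_agents), 2)
        name = propose(speaker)
        if inventories[listener] is None:  # committed listeners never change
            continue
        if name in inventories[listener]:
            # Agreement: both sides collapse to the shared name.
            inventories[listener] = {name}
            if inventories[speaker] is not None:
                inventories[speaker] = {name}
        else:
            # Disagreement: the listener remembers the new candidate.
            inventories[listener].add(name)

    flexible = [inv for inv in inventories if inv is not None]
    return sum(inv == {ALT} for inv in flexible) / len(flexible)

# Well above the toy model's threshold, the alternative takes over;
# well below it, the old convention survives.
high = run_naming_game(committed_frac=0.4)
low = run_naming_game(committed_frac=0.02)
```

Running the sketch at an activist share well above the threshold drives essentially every flexible agent to the new name, while a share well below it leaves the old convention intact, mirroring the “clump of failures” and “clump of successes” Centola describes.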

“What I think is happening at the threshold is that there’s a pretty high probability that a noncommitted actor”—a person who can be swayed in any direction—“will encounter a majority of committed minority actors, and flip to join them,” says Pamela Oliver, a sociologist at the University of Wisconsin at Madison. “There is therefore a good probability that enough non-committed actors will all flip at the same time that the whole system will flip.”

Centola agrees. “People need enough reinforcement on a new social norm before they’ll switch,” he says. “Say you shake hands at every business meeting. If you fist-bump, you might seem weird, or like you’re trying to be edgy. But if enough people fist-bump, there’s now a feeling that you’re all on the same page. That feeling of riskiness holds people back, and the tipping point creates a group large enough that you’re more likely to meet people showing the same behavior.”

He stresses that the 25 percent figure isn’t universal, and will likely vary depending on the circumstances. Indeed, the stakes in his experiment were very low. Volunteers jostled over arbitrary norms, rather than, say, politically charged beliefs. And both the established group and the incoming activists had similar amounts of power—something that’s rarely the case in real life.

“What if the minority dissenters are people of color, whose voices have been structurally disadvantaged in many ways?” asks Hahrie Han, a political scientist at the University of California at Santa Barbara. “Or what about situations in which people have social identities anchoring their beliefs, such as people who are climate deniers? Thinking about these questions would help to extend the findings from this important and creative study.”

Centola made a start on simulating some of these dynamics by changing his mathematical model so that (virtual) communities were heavily incentivized to stick to their guns, and more resistant to the influence of activists. That did raise the tipping point—but only to 30 percent. “We still saw this strong critical-mass effect at well below 50 percent,” he says.

If this bears out in real life, the implication is that activist groups will be unsuccessful until suddenly, they’re not. “If a minority group is at 24 percent, their success in terms of affecting the population is the same as if they were at zero. You’re close to success but you can’t feel or see it,” says Centola. “But even if a single person changes the population from 24 percent to 25 percent,” the result would be very different.

This isn’t necessarily an uplifting message, Centola stresses. “It’s really important to be aware of how easily populations can be co-opted by people with an agenda,” he says. Russian-linked Facebook accounts bought a significant number of ads that targeted U.S. voters during the 2016 presidential election. The voter-profiling company Cambridge Analytica used information from millions of people on Facebook to create psychographic profiles, and then used those to target ads supporting Donald Trump’s 2016 campaign and the Brexit “Leave” campaign. The Chinese government has been seeding groups of activists into online communities to subtly shift discussions towards national pride, and to distract from collective grievances. (“We’re now looking at times in which these activists became more active to see if they reached this 25 percent threshold,” Centola says.)

“There are already a number of people out there who are gaming group dynamics in careful ways,” says Damien Williams, a philosopher at Virginia Tech who studies the ethical implications of technology. “If they know what target numbers they have to hit, it’s easy to see how they could take this information and create a sentiment-manipulation factory.”

“It’s a little disquieting,” Williams adds. “It would very likely yield an environment in which we have to be a lot more careful about who’s moving us and how.”
