Vaccine Myth-Busting Can Backfire
For people who mistrust vaccination, learning the facts may make the problem worse.
“Facts are stubborn things,” John Adams said in 1770, delivering his defense of the British soldiers on trial for the murders of the Boston Massacre.
“As John Adams said, facts are stupid things,” Ronald Reagan said 218 years later, delivering his speech at the Republican National Convention in New Orleans.
Transcripts of the proceedings show that Reagan caught and quickly corrected his slipup—“stubborn things, I should say,” he added, to audience laughter. But the misquote may have also been a sharp, if accidental, insight on the part of the former president. A new study published earlier this week in the journal Vaccine found that when it comes to vaccination, a subject rife with myths and misperceptions, knowing the facts may not really be all that effective in winning over anti-vaxxers—and, in some cases, may even do more harm than good.
Last year, only slightly more than 40 percent of U.S. adults got vaccinated against the flu—a number likely explained, at least in part, by the persistent (and wrong) belief that the shot, which contains only inactivated viruses, can actually give people the flu. Despite efforts by the Food and Drug Administration, the Centers for Disease Control and Prevention, and even CVS to debunk the idea, it remains stubbornly pervasive. But according to the Vaccine study, all that myth-busting may be for nothing, anyway: The study found that when people concerned about side effects of the flu shot learned that it couldn’t cause the flu, they actually became less willing to get it.
Researchers surveyed a nationally representative sample of 1,000 Americans on their level of concern about vaccine side effects (around a quarter said they were “extremely concerned” or “very concerned”) before dividing them into three groups: one that read a message adapted from the CDC debunking the myth that people can contract the flu from the vaccine, one that read about the danger of the flu, and one that received no additional information. Afterwards, participants answered questions about the safety of the vaccine and whether or not they intended to get it for the coming flu season.
Around 43 percent still said they thought that the flu vaccine could cause the flu, though people who read the myth-busting message were the least likely of the three groups to say so. Among the people who had expressed the most concern about side effects, though, reading the correction tended to push them further in the opposite direction—after learning that the virus in the vaccine couldn’t cause the flu, they were less inclined to get it.
“The corrections we tested were effective at reducing misperceptions—which seems great, until you get to the behavioral outcomes and people say they’re less likely to vaccinate, rather than more,” said study co-author Brendan Nyhan, a professor at Dartmouth College.
The study built on previous research from Nyhan and Exeter University’s Jason Reifler, published earlier this year in the journal Pediatrics, that found a nearly identical effect when parents were exposed to information about the measles-mumps-rubella (MMR) vaccine. After reading that the MMR vaccine wasn’t dangerous to their children, the Pediatrics study found, the most concerned parents were less likely than before to say they would vaccinate their children.
Both political scientists, Nyhan and Reifler have spent the past several years studying what they call the “backfire effect,” the idea that when presented with information that contradicts their closely held beliefs, people become more convinced, not less, that they’re in the right. In one study, when staunch conservatives read information refuting the idea that the U.S. found weapons of mass destruction in Iraq, they tended to believe more firmly than before that it was true; the researchers saw similar effects in studies correcting the notion that President Obama is Muslim and the claim that “death panels” were a part of healthcare reform.
From there, vaccination seemed like a logical next step in their research, Nyhan said: “Vaccines aren’t a partisan or ideological issue, but they’re controversial. They bring up issues of identity and tribalism that feel a lot like politics,” he explained. “I have kids, and talking about vaccines on the playground is like bringing up religion. It’s very weird and delicate and controversial.”
Though the vaccine studies have yielded results subtly different from the “backfire effect”—people were willing to accept the new information as true, even though it had no effect on what they did in the end—Nyhan believes that the same sort of mental gymnastics is likely at work across both areas: reactance, the psychological phenomenon in which attempts to persuade people to accept an idea can push them in the opposite direction.
“Think of a teenager when you tell them not to do something. That kind of response that we’re describing is going to likely be on the most controversial issues,” Nyhan said.
But the effect, the researchers explained, is more one of self-preservation than pure contrariness. “When your sense of self and your worldview are challenged, you need to have a defense mechanism in place. It’s much easier to say ‘This information is wrong’ than to say, ‘How I view the world turned out not to be correct,’” Reifler said.
So if correcting misinformation doesn’t work—what does?
“There’s often a temptation to think that the right messaging will be a silver bullet that will magically overcome vaccine resistance. I think that’s unlikely,” Nyhan said. “We’re not saying that the CDC or doctors or anyone else shouldn’t be setting the record straight about the safety of the flu shot. They should. But we should think about how to do that most effectively.”
And facts, it seems—stupid, stubborn things that they are—might not be the answer.