In general, Americans believe in science.
A new report by the Pew Research Center found that 79 percent of the 2,000 adults surveyed think science has “made life easier for most people.” Seventy-one percent think that investment in science ultimately pays off.
But on certain hot-button scientific topics of our day, Pew found wide gaps between what the public believes and what scientists believe. You can probably guess which ones.
Genetically modified foods: 88 percent of scientists say they’re “generally safe” to eat; 37 percent of the public agrees.
Vaccines: 86 percent of scientists believe they should be required in childhood, compared to 68 percent of the public.
Climate change: 94 percent of scientists say it’s a “very serious” or “somewhat serious” problem; 65 percent of the public agrees. Eighty-seven percent of scientists blame humans; 50 percent of the public does too.
Evolution: 98 percent of scientists say they believe humans evolved over time, compared to 65 percent of the public.
There were also large disparities on issues like whether it’s safe to eat foods grown with pesticides (scientists: 68 percent; public: 28 percent), and whether the world’s growing population will be a problem (scientists: 82 percent; public: 59 percent).
If you’ve ever dipped a toe into the toxic water of a comments section, these numbers aren’t particularly surprising. For every science article published online, there’s someone ready to jump in and call it pseudoscience. (Michael Shermer in Scientific American has written that “the term ‘pseudoscience’ is subject to adjectival abuse against any claim one happens to dislike for any reason.”) But the numbers also show that most people trust and value science in a general sense; they just aren’t aligned with it on these specific issues.
There’s certainly room for debate in science, and the process of answering a question almost always extends beyond the scope of a single study. The trouble is that although evidence can’t be refuted simply by saying “no,” facts often have little bearing on how someone feels. That climate change is real (and caused by humans), and that vaccines are safe, are two of the most evidence-heavy, backed-up statements we have in modern science. Whether people believe them may have nothing to do with whether they trust scientists or not.
“There is this really strong conventional wisdom that the U.S. is experiencing some kind of creeping anti-science sensibility in the public, and this explains why we have conflicts over things like climate change or evolution,” says Dan Kahan, a law and psychology professor at Yale Law School. “It’s a mistake to think that has to do with disagreement about the authority and value of science in our society.”
The Pew numbers show the public’s support for science, he says, even though the commentary in the report emphasizes that the percentage of people who think science has a positive effect on society decreased slightly from the last time Pew did this survey in 2009. Kahan thinks that focus is playing into the narrative of distrust.
“It’s almost as if they don’t want to pop the conventional wisdom balloon with the needle of their own data,” he says, adding, “If we’re going to get really anxious that 4 percent less of the public thinks science is the greatest thing since sliced bread, what are we doing with the fact that [13 percent of scientists don’t think humans are responsible for climate change]?”
If not distrust, then what accounts for the gaps? Some people might not be aware of what scientists think. In the Pew report, 37 percent of people said they didn’t think scientists agreed on climate change, and 67 percent thought scientists didn’t have a clear understanding of GMOs’ health effects. Or they think the scientists actually support their beliefs. This is particularly true of climate change, Kahan says.
“People on both sides of the issue think science is on their side. It’s like when nations at war each think God is on their side, and they think the other side is godless.”
In the time of the Internet, someone can find evidence (real or not) to support almost any belief he wants. There’s an understandable bias toward valuing evidence that reinforces already-held beliefs: Kahan’s research has shown that people tend to ascribe more legitimacy to the experts who agree with them.
“The reasons people decide to believe things are complicated,” says Eula Biss, a professor of English at Northwestern University and the author of On Immunity, a book examining why people fear vaccines. “With vaccines, some people are primed to be suspicious because of what they know about the historical relationship between the medical establishment and women, or what they know about the history of corruption around pharmaceutical companies. People are drawing on real knowledge, but they’re allowing the answer to one question to be the answer to another.”
For their part, scientists in the Pew survey faulted the media and the public itself for the existence of these gaps. That the “public doesn’t know much about science” was cited as a major problem by 84 percent of scientists, and 79 percent said the same of news reports that “don’t distinguish well-founded findings.” About half of scientists said oversimplification by the media and a public that expects solutions too quickly were major problems.
Fair enough. Translating dense, precise scientific studies into digestible, clickable news stories is a tricky business. When a publication mistakenly says a single study “proves” something, or, heaven forbid, implies causation where there is merely correlation, those who know better are eager to jump in and point out the mistake. And it probably doesn’t help the publications’ reputations as legitimate sources of information. Of course, no matter how careful a writer is to say “associated with,” to transparently point out small sample sizes, to repeat the scientists’ claim that “more research is needed,” you’ll still get commenters crying “pseudoscience.”
But the clarity, accuracy, and availability of information, while important, is not a magic elixir to change hearts and minds. For example, in a recent study, telling people that the flu vaccine doesn’t cause the flu made people less likely to vaccinate, even if they accepted the information as true. (Though, for what it’s worth, despite the recent hubbub over anti-vaxers, vaccination rates in the U.S. are still very high.) And Kahan says that asking people whether they believe in evolution, as Pew did, has nothing to do with how well they understand the theory.
“It doesn’t measure science literacy, it measures whether you’re religious,” he says. “It’s just an expression of identity.”
Adding to the puzzle is the notion that stories can sometimes be more powerful than data. As Vanessa Wamsley previously wrote in The Atlantic, a personal anecdote from a friend can feel more immediate and important than, say, a statement from a government agency.
Biss told me about a story she heard from a reader of her book, who changed her mind in favor of vaccination after she had a child with a birth defect who was “profoundly vulnerable to any respiratory illness,” Biss said. “And her baby died. It’s an incredibly heartbreaking story, but after she lost that baby, she was really open to thinking differently about medicine … Information alone is not going to do it. Something else has to be given to you that changes your willingness to hear that information.”
“Bombarding people with knowledge doesn’t help,” Kahan says. And he points out that the scientists’ beliefs about where the public’s beliefs are coming from are similarly not completely knowledge-based, for the aforementioned reasons. “They’re committed to believing what they’re going to believe, just like the public is… It doesn’t do anything to explain things to people, but here I am just explaining the facts over and over again.” He laughs. “Maybe the joke’s on me.”
“As an educator, I just cannot accept that,” Biss says, of the idea that information can't change minds. (As a journalist, I, too, would rather not accept that.) “I really have a lot of faith in people’s ability to not just learn, but change. Part of what my book is about is changing my mind.”
In communicating science, there are personal narratives and there are facts, but there’s also analysis of the facts, and the philosophies behind that analysis. Biss thinks that where facts alone might not change someone’s mind, those other things might. Of course, not every article can be a deep analysis; sometimes you just need to say what happened. But people with different values and beliefs might interpret the same straightforward article very differently.
“We build philosophical structures, and when we encounter information, we plug it into those structures,” Biss says. Maybe building up another structure around scientific evidence, and putting it in context—not just the context of other research, but the historical, social, and philosophical context—could reach some of the people in the gap.
That’s a tall order, admittedly, for the people doing the writing, and probably unattainable a lot of the time. Still, “I don’t think it’s time for us to throw up our hands and say nobody’s listening,” Biss says.