One of the sources of academic disdain for popular health media is its reputation for sensationalism and exaggeration. "If You've Ever Eaten Pizza, You'll Want to Read About the Toxin That Is Pretty Certainly Ravaging Us From the Bowels Outward" or "This Common Household Item Is Definitely Killing You, Says a New Study"—when the actual study only posited that a "possible association may potentially exist" between, say, exposure to antibacterial soap and liver disease in a handful of mice who were exposed to more antibacterial soap than any human could ever dream of using, even if they washed their hands literally every time they went to the bathroom.
Petroc Sumner, a professor of psychology at Cardiff University in Wales, has been trying to pinpoint exactly where exaggeration in science reporting comes from. At what level, in the ladder from lab data to news headline, are most inaccuracies introduced?
Yesterday Sumner and colleagues published some important research in the journal BMJ, tracing the majority of exaggeration in health stories not to the news outlets but to the press releases—the statements issued by universities' publicity departments.
"The framing of health-related information in the national and international media has complex and potentially powerful impacts on healthcare utilization and other health-related behavior," Sumner and colleagues write. "Although it is common to blame media outlets and their journalists for news perceived as exaggerated, sensationalized, or alarmist, most of the inflation detected in our study did not occur de novo in the media but was already present in the text of the press releases."
The goal of a press release around a scientific study is to draw attention from the media, and that attention is supposed to be good for the university, and for the scientists who did the work. Ideally the endpoint of that press release would be the simple spread of seeds of knowledge and wisdom; but it's about attention and prestige and, thereby, money. Major universities employ publicists who work full time to make scientific studies sound engaging and amazing. Those publicists email the press releases to people like me, asking me to cover the story because "my readers" will "love it." And I want to write about health research and help people experience "love" for things. I do!
Across 668 news stories about health science, the Cardiff researchers compared the original academic papers with the news reports they generated. They counted exaggeration and distortion as any instance of implying causation when there was only correlation, implying relevance to humans when the study was only in animals, or giving direct advice about health behavior that was not present in the study. They found evidence of exaggeration in 58 to 86 percent of news stories when the corresponding press release contained similar exaggeration. When the press release was staid and made no such errors, the rate of exaggeration in the news stories dropped to between 10 and 18 percent.
Even the degree of exaggeration in the news stories closely tracked the degree of exaggeration in the corresponding press releases.
Sumner and colleagues say liability should rest not with press officers, but with academics. "Most press releases issued by universities are drafted in dialogue between scientists and press officers and are not released without the approval of scientists," the researchers write, "and thus most of the responsibility for exaggeration must lie with the scientific authors."
In an accompanying editorial in the journal, Ben Goldacre, author of the book Bad Science, noted that bad news tends to generate more coverage than good, and that less rigorous observational studies tend to generate more coverage than robust clinical trials, probably due to the applicability of the subject matter to lay readers.
Guidelines for best practices already exist among academic journals and institutional press officers, he notes, "but these are routinely ignored." So Goldacre corroborates Sumner's argument for accountability: academics should be held responsible for what is said in the university press releases that publicize their research. A press release will often be read far more widely than the journal article it promotes, yet many academics take little to no interest in it. Instead, writing an accurate press release should be considered part of the scientific publication process.
"This is not a peripheral matter," writes Goldacre, citing research that has found that media coverage has important effects on people's health behaviors and healthcare utilization, and even on subsequent academic research.
He notes that Sumner was "generous" to avoid naming particular offenders in this study. But Sumner did share with me some of the less egregious examples by email. In one case, a journal article read: "This observational study found significant associations between use of antidepressant drugs and adverse outcomes in people aged 65 and older with depression." The corresponding press release read: "New antidepressants increase risks for elderly." There are of course many reasons why taking antidepressants would be associated with worse outcomes. For example, people with worse symptoms to begin with are more likely to take antidepressants.
"It is very common for this type of thing to happen," said Sumner, "probably partly because the causal phrases are shorter and just sound better. There may be no intention to change the meaning."
There is also, almost always, an implied causal relationship when reporting on a correlation. Every time we note a correlation in anything we publish on this site, at least one of our fair commenters will jump in to point out that correlation is not causation. That comment may as well auto-populate on any article that involves science. Which is fine—even though we are careful not to present these relationships as causal—because why report on a correlation at all if you don't mean to imply that causation is at least possible?
I asked Sumner how he felt about the press release for his study, because I thought that would be kind of funny.
"We were happy with our press release," he said. "It seemed to stick closely to the article and not claim causal relationships, for example, where we had not."
Appropriately reported scientific claims are a necessary but not sufficient condition in cultivating informed health consumers, but misleading claims are sufficient to do harm. Since many such claims originate within universities, Sumner writes, the scientific community has the ability to improve this situation. But the problem is bigger than a lack of communication between publicists and scientists. The blame for all of this exaggeration is most accurately traced back, according to the researchers, to an "increasing culture of university competition and self-promotion, interacting with the increasing pressures on journalists to do more with less time."
In his ivory tower, in his ivory cap and gown, the academic removes his ivory spectacles just long enough to shake his head at the journalists who are trying to understand his research. The headlines and tweets are wretched misappropriations. Wretched! The ink-stained journalists shake their ink-stained heads in time at the detached academics, at the irrelevance of work written in jargon behind giant paywalls where it will be read by not more than five to seven people, including the nuclear families of the researchers. The family members who, when the subject of the latest journal article comes up at dinner, politely excuse themselves.
But the divide is narrowing every day.
"Our findings may seem like bad news, but we prefer to view them positively," Sumner and colleagues conclude. "If the majority of exaggeration occurs within academic establishments, then the academic community has the opportunity to make an important difference to the quality of biomedical and health-related news."