'One of the Greatest Discoveries in the History of Science' Hasn't Been Peer-Reviewed—Does It Matter?

How scientific communication is evolving in the age of the Internet
The South Pole Telescope and the BICEP (Background Imaging of Cosmic Extragalactic Polarization) Telescope at Amundsen-Scott South Pole Station (Reuters)

Big scientific discoveries—the kind that shift our view of the world and our place within it—don’t come along very often.

This week, though, one did.

New data seem to offer, for the first time, direct evidence of the entities Einstein predicted in his general theory of relativity: gravitational waves. Which is a finding that, if it holds up, sheds new light on nothing less than the origins of the universe. The discovery is, according to one expert, “an amazing achievement.” It is also, according to another, “one of the greatest discoveries in the history of science”—“a sensational breakthrough involving not only our cosmic origins, but also the nature of space.”

So, basically: This is big, you guys! Einstein big! Nature-of-space big! Big Bang big!

There’s just one small thing, though. The findings shared this week also share a significant caveat: They haven’t yet been peer-reviewed. They are discoveries that are, as far as scientific institutionalism is concerned, provisional. They’re stuck in a kind of epistemological limbo—as information that has not yet been converted into fact, and data that have not yet been codified into knowledge. Official status: truthy.

***

Scientists are, like the rest of us, impatient. They are, much more often than the rest of us, justified in this. Imagine dedicating your career to learning something new about the mechanics of the world—the gravitational forces exerted on a cell membrane, the flappings of a bee’s wings, the earliest churnings of the cosmos—and then imagine actually finding that thing. Now imagine that, instead of doing what every impulse would guide you to do (share that news with everyone you know/share that news with everyone you don't know/shout that news from the rooftops or at least your Facebook page) … you are made to wait. And wait. And wait. Until, many months later, your work has been deemed acceptable for proper publication.

For many scientists, this holding pattern of human enthusiasm is one of the most salient facts of peer review, the painstaking process by which the discoveries of individual scientists are weighed and tested by fellow experts in the field. It's a process that is purposely inefficient and pointedly complex, one meant to distinguish scientific discoveries from every other kind. It is also, in an age when anyone—scientists included—can be a publisher, increasingly controversial. A 2011 report from the British House of Commons Science and Technology Committee found that, while peer review "has always been regarded as crucial to the reputation and reliability of scientific research” and “continues to play an important role in ensuring that the scientific record is sound,” many scientists also find its constraints to be detrimental to their work and their ability to share it. (Some also doubt its worth more generally: There is, the report noted, “little solid evidence” about peer review’s overall efficacy.)

Peer review, though as a concept it dates back to the Scientific Revolution, is not a fixed feature of the scientific method. The peer-reviewed scientific journal, the Harvard historian Melinda Baldwin points out, “isn't nearly as old as most observers think it is.” While “many histories of journal publishing claim that we've had both journals and peer review since the Scientific Revolution,” she told me, the highly specialized scientific journals we’re familiar with today “didn't become the dominant way of communicating scientific findings until the 19th century.” And “it wasn't until the 20th that journals had to be peer-reviewed to be considered scientifically respectable.” The prestigious journal Nature, she notes, made an occasional practice of publishing non-peer-reviewed research articles up through 1973.

Which is all to say: Just as scientific knowledge evolves, so do the tools that help us earn it. The mechanisms we rely on to encode the truths of the physical world evolve along with everything else. 

***

This week’s big Big Bang announcement is akin to the 2012 announcement of the discovery of the Higgs boson—which was both similarly epic (“God particle,” etc.) and similarly publicized before the scientific community had given it its institutional ratification. Before proceeding through peer review and journal publication (which would culminate in a Nobel Prize for Peter Higgs and François Englert), the discovery was rushed to market. The market being, in this case, the media and the public at large.

It took, in the end, about two months—a relatively quick turnaround—for the Higgs's “potential Huge Discovery” to be upgraded to “actual Huge Discovery.” Which led, eventually, to awkward announcements like this one: “CERN's Higgs boson discovery passes peer review, becomes actual science.” To the lay reader, a headline like that would seem confusingly redundant.

So what we have in the Big Bang news, basically, is the same thing we had with the Higgs news: a disconnect between traditional systems of scientific codification and newer, nimbler ones—a tension between publicity strategies within scientific institutions and beyond them. It's not as stark a dichotomy as journal publication vs. web publication ... but it's close. There are, after all, competing pressures at play: on the one hand, the pressure to be first; on the other, the pressure to be right. And the latter pressure, ideally, trumps the former. Take the announcement of cold fusion (or, more accurately, "cold fusion"): in 1989, the chemists Stanley Pons and Martin Fleischmann announced—to great fanfare, and via a widely publicized press conference—that they had achieved nuclear fusion at room temperature. In a jar of water. This was hailed as "the greatest discovery since fire" ... until none of the chemists' colleagues were able to replicate their work. Now the whole thing is treated as a cautionary tale.

Megan Garber is a staff writer at The Atlantic. She was formerly an assistant editor at the Nieman Journalism Lab, where she wrote about innovations in the media.
