In Science, There Should Be a Prize for Second Place

Some scientific journals are defusing the fear of getting “scooped” by making it easier for scientists to publish results that have appeared elsewhere.


It is the moment that most scientists fear: You learn that your competitors have done experiments similar to those that have occupied the past several years of your life, and have published the results before you. In the jargon of research, you have been scooped.

In science, there are few prizes for second place. Your chances of publishing your own work are now limited, since most major scientific journals put a premium on “novelty.” That is, they only want to publish things that are new, and they’ll often reject papers whose discoveries have already appeared elsewhere. For the scooper, glory. For the scoopee, heartbreak, and the tedium of revisiting past experiments to somehow make them seem fresh.

“You end up investing yet more time and resources into what was essentially a fully formed research story ready to be shared with the world,” says Leon Van Eck from Augsburg University. “It can be very demotivating. It’s a feeling of being trapped in a hamster wheel, churning out data, none of which gets submitted quite in time to qualify as novel.”

Now, some journals are taking a stand. This Monday, the editors of PLOS Biology—the flagship journal of the Public Library of Science, a nonprofit publisher—published an editorial saying that they are now willing to publish papers that were scooped less than six months ago. And in a clever bit of rebranding, they’re abandoning the word “scooped” altogether in favor of calling these “complementary” papers. “Just as summiting Everest second is still an incredible achievement, so too, we believe, is the scientific research resulting from a group who have (perhaps inadvertently) replicated the important findings of another group,” the editors wrote.

The fear of being scooped has pernicious and wide-ranging effects. It weakens the reliability of science, as researchers might favor quick and dirty experiments that will lead to many published papers over careful, methodical work that leads to rigorous results. It penalizes scientists for checking the work of their peers, instead of breaking new ground themselves. It stymies collaboration and the free exchange of ideas by making people nervous about even mentioning ideas to their colleagues. It keeps the public in the dark about ongoing, taxpayer-funded research, as researchers often will not let journalists report on work in progress, for fear that their competitors will catch wind. It spells trouble for work-life balance: One Nobel laureate reportedly tells his graduate students to work weekends on the basis that they’re putting in 40 percent more effort than someone working only weekdays.

For these reasons, the PLOS Biology announcement was met with nigh-universal approval on social media. “This changes EVERYTHING. Well done,” tweeted Veronique Miron at the University of Edinburgh. “I think this is the first time a scientific journal has ever made me cry,” tweeted Van Eck. “This will change the career trajectory for many a disillusioned scientist. Thank you.”

Ironically, the PLOS Biology announcement was itself scooped by another journal, eLife, which detailed a similar policy last July. The deputy editor, Eve Marder, wrote that editors wouldn’t penalize a paper if another one on a similar topic had been published a few weeks or months earlier. “We are seeing a trend toward the co-submission of papers from labs that choose not to compete but, rather, to jointly announce new findings,” she wrote. “This is a trend that we encourage.” The editors at EMBO Press have a similar policy.

This initiative comes at an important moment. In recent years, many scientists have worried about a so-called reproducibility crisis, where the pressure to produce new, eye-catching results has culminated in a lot of research that may not actually be true. These concerns have spawned a growing reform movement, whose members are pushing for practices that will make science more reliable—such as investing time in replicating the work of other teams.

The PLOS Biology editors argue that scooped—sorry, complementary—work is a kind of “organic replication.” After all, one team has effectively checked the work of another, albeit unintentionally. That should be a source of pride rather than shame. Both parties get independent confirmation that they were right. They should bump fists, rather than gnash teeth. “What’s perceived as a negative by the scientific community should be perceived as valuable research,” says Emma Ganley, the chief editor of PLOS Biology.

“If one of these groups was just a bit delayed for myriad reasons—reagents arriving late, personnel issues, an experiment simply not working, or perhaps a desire to repeat a few experiments for confirmation—it is difficult to justify providing all the accolades to the group who beat them to the finish by a hair,” says Piali Sengupta from Brandeis University.

Ganley tells me that her team has just formalized a policy that they had been informally following on a case-by-case basis. For example, in February 2014, the journal Science published a paper describing the structure of a protein called UbiA. (Proteins are important molecules with complicated three-dimensional shapes, which can be very difficult to figure out.) Another team had independently been trying to solve the same structure, and five months later, PLOS Biology published their results. “The authors had done more analyses and there was still a huge amount of value to this work,” says Ganley.

Similarly, in 2015, the journal PNAS published a paper describing the genome of a tardigrade—an adorable animal that can withstand implausibly extreme environments. Two years later, PLOS Biology published a second such genome, in a paper that contradicts some of the conclusions from the first one. “I suspect other journals would have passed on this for reasons of it not being the first,” says Ganley. The new policy simply codifies the team’s stance—at least for papers that were scooped within a six-month window. Beyond that, they are still open to considering each new case as it comes.

The policy is a boon to early-career scientists, who have recently started their own research groups. “It takes quite a bit of time to get the experiments going,” says Miron. “There is always an undercurrent of anxiety that your best idea will be scooped and that you’ll have nothing to show for many years of work, with profound consequences for you and your staff. Knowing that well-respected journals like PLOS Biology are improving the system is so important for supporting new investigators in an otherwise very difficult period.”

It’s good for science education, too. It means that researchers at smaller institutions that mostly teach undergraduates “do not need to shy away from pursuing potentially high-impact research, out of fear that a bigger lab will beat them to the punch,” says Alison Pischedda from Barnard College. “This means that undergraduate researchers at all schools can potentially be involved in exciting research.”

Michael Hendricks from McGill University says the concept is great, but he’s skeptical about its execution. In particular, the PLOS Biology editors wrote that they hope that scooped researchers will use the journal’s six-month protection window to “fully support and potentially extend the results of the first article.” That, Hendricks says, “is an implied expectation that you do substantial further work.” It creates much the same problem as the current publishing system—scientists must scramble to do more experiments because someone else got their work out first.

Ganley says the line was poorly phrased, and their intention is exactly the opposite. They think of the six-month window as a chance for researchers to “complete the work they were intending to do, rather than having a panicked reaction to being scooped and submitting there and then,” she says. “The intention is not to hold people accountable for replicating every piece of a prior study.”

Hendricks also notes that the new policy might be difficult to enforce among reviewers—the scientists who look over new papers and decide if they’re worthy of publication. For example, PLOS publishes another journal—PLOS One—where editors and reviewers are explicitly told not to care about novelty. And yet, many do. But Ganley is optimistic about this, too. At PLOS Biology, every paper is handled by a pair of editors—an academic, and someone from the journal’s in-house team. If the former won’t hew to the new policy, the latter can overrule them.

“The way that scientific publishing fetishizes novelty is gross,” says Van Eck. “That’s the business model of tabloids, not scientific journals. I am elated that journals like PLOS Biology are finally recognizing the value of replication and reproducibility in scientific studies.”