Retractions of research papers have soared in recent years, showing how unethical scientists, like financial criminals, cut corners to tell people what they want to hear.
Tom Bartlett at the Chronicle of Higher Education, outlining the techniques of the Dutch psychology professor Diederik Stapel, who has admitted fabricating data, makes scientific fraud sound a lot like Madoff-style financial deception: both involve social networking, stonewalled disclosure, and indignation when questioned. The Ponzi schemer and the data fabricator share with other confidence artists a gift for recognizing the stories people would like to hear -- for example, that messy surroundings increase racial prejudice against their residents. Bernard Madoff's story was tailored precisely to skeptical investors who questioned others' claims of unusually high yields: in a just universe, shouldn't it be possible to achieve more modest goals consistently? That's a plausible but wrong hypothesis, and some (though not all) experienced investors fell for it.
There's another parallel: both Bernard Madoff's and Diederik Stapel's numbers were too good. Madoff's returns were actually lower on average than those of some legitimate hedge fund operators, but legitimate funds can experience wild swings; most of John Paulson's investors, for instance, have remained loyal despite recent heavy losses. Real markets, like the weather, have spikes and anomalies. According to the Associated Press report of the case, Stapel's own graduate students blew the whistle when they found his data "too perfect to be true."
The science journalist Eugenie Samuel Reich has described a different style of falsification in physical science by Jan Hendrik Schön, the fallen star of Bell Labs:
It came to light that he had duplicated data between different research papers when the context for the data would be different. So he was representing different experiments using exactly the same data sets. Some of the noise that he had introduced to make them look a little more realistic was identical, which was not something you would expect to happen in nature and was really a sign of human artifice.

The bad news is that it may be hard to identify statistically sophisticated fraud as such:
I think that a more meticulous fraudster would have got a lot further, and I don't think they would have been exposed in the dramatic way that Schön was. Certainly there would have been problems with the reproducibility of his data, because that's not something that a fraudster would be able to second-guess, even someone who was very meticulous. But the reproducibility problems that did exist in Schön's data were not sufficient for him to be outed as a fraud; they simply led people to think that he must have a very clever method that he wasn't sharing with others. So I think things would have stalled in that situation for many months, and possibly years, longer than they did.

So there is a risk that exposure, far from ending financial and scientific fraud, will lead to new and more sophisticated ways to work around new safeguards. And the Stapel case is far from isolated. Retractions of scientific papers have soared from 22 to 339 over nine years, and the more prestigious the journal, the greater the number of retractions -- possibly because those papers receive more attention, but also because the rewards of acceptance are so much greater. The New York Times reports that 70 percent of psychology researchers responding anonymously to a survey admitted "cutting some corners" in reporting data.
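Reich's point about identical noise suggests why Schön was catchable at all: residuals from genuinely independent experiments should not match. Below is a minimal sketch of that kind of screen in Python; the function name, the linear-detrending step, and the correlation threshold are all illustrative assumptions, not a reconstruction of how the Bell Labs investigators actually worked.

```python
import numpy as np

def duplicated_noise(series_a, series_b, threshold=0.99):
    """Flag two supposedly independent measurement series whose
    residual noise is suspiciously similar.

    Each series is detrended with a linear fit (an assumed, simple
    model); what remains is the 'noise'. Independent experiments
    should have essentially uncorrelated residuals, so a residual
    correlation near 1.0 is a sign of copied data."""
    a = np.asarray(series_a, dtype=float)
    b = np.asarray(series_b, dtype=float)
    x = np.arange(len(a))

    def residuals(y):
        # Subtract the best-fit straight line, leaving the noise.
        return y - np.polyval(np.polyfit(x, y, 1), x)

    corr = np.corrcoef(residuals(a), residuals(b))[0, 1]
    return corr > threshold

# Toy demonstration with fabricated data:
rng = np.random.default_rng(0)
x = np.arange(50)
shared_noise = rng.normal(0, 0.1, 50)   # reused across "experiments"
honest_noise = rng.normal(0, 0.1, 50)   # independent noise

print(duplicated_noise(2 * x + shared_noise, 5 * x + shared_noise))
print(duplicated_noise(2 * x + shared_noise, 5 * x + honest_noise))
```

The first call flags the pair because the same noise was pasted under two different trends; the second pair, with independent noise, passes. As Reich notes, a fraudster who regenerated fresh noise for each faked data set would sail past exactly this kind of check.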
There is still one big difference between Ponzi tactics and scientific deception. We don't know how many money managers may have falsified books to keep solvent during a crisis and made enough money when markets rebounded to cover the shortfall. But if a fraudulent scientific result is important enough to be notable, as Eugenie Reich's comments suggest, the researcher's reputation will eventually suffer even if wrongdoing can't be proved. That remains the trump card of honest science.