When I was asked to speak at the Los Angeles installment of the March for Science, a vision leapt unbidden to my mind: thousands of scientists and science-lovers gathered in Pershing Square, carrying whiteboards and graphs, arguing with each other about how to properly interpret the data they were showing.
Presumably the real march won’t be like that. But nothing would be more characteristic of how scientists behave in the wild than a bit of good-natured disagreement. Indeed, the March for Science itself has notably stirred up some controversies—over the fear that it turns science into a partisan special interest, over worries that it has tried too little (or too hard) to promote diversity, over a concern that scientists shouldn’t descend to the tawdry realities of politics.
This tendency toward argumentativeness is, as computer scientists like to say, not a bug but a feature. And it can be traced back to one of science’s core values, which it shares with democracy: the idea that nobody has all the final answers.
Science and democracy are two very different things. (Whenever someone wants to liven things up at a science conference by taking a straw poll of the participants on some contentious issue, a curmudgeon in the back will inevitably grumble “Science isn’t a democracy!”) Science is a method for learning things about the natural world, while democracy is a way we decide how to govern ourselves.
But notwithstanding their important differences, science and democracy share a crucial heritage. Though prefigured in many ways in the classical world, they each began to flourish in the early modern period in the 16th and 17th centuries. Among the many ideas that were mixed together to create these impressive structures was that of fallibilism: the simple conviction that we can always be wrong.
The centrality of this idea to democracy should be clear. By voting for our representatives, and then doing it again some time later, we are acknowledging that there is no one perfect ruler, no philosopher-king with all possible wisdom. If there were, it might make sense to turn over all power to them, as Plato advocated. Instead, we elect leaders on a provisional basis, reserving the right to change our minds (and sometimes even imposing term limits to prevent ourselves from being swept up with enthusiasm for a particularly charismatic politician). The United States Constitution proudly features checks and balances that make it hard for any single person or institution to wield too much power.
The centrality of fallibilism to the practice of science is less obvious, but equally important. We have our wise heroes, our Newtons, Darwins, and Einsteins. But they are not infallible. There is no Science Pope to whom we can turn for final adjudication of sticky research questions.
Precisely the opposite: Science proceeds by showing how our wise heroes were, in larger or smaller ways, mistaken. Einstein overthrew Newton’s cosmos, and modern biologists are improving upon Darwin all the time. You may have a brilliant theory of the universe, but if it is contradicted by an experiment performed by a lowly graduate student, the data wins.
Science and democracy, in other words, both upend the ancient pyramids of power and knowledge: Answers bubble up from the bottom, rather than being imposed from the top.
This similarity between science and democracy is worth pointing out because it’s not an easy or obvious one to rally around, and that makes it fragile. Few people march down the streets carrying signs proclaiming “I Could Very Well Be Wrong!” But if we refuse to acknowledge the demands of that principle, both science and democracy will be threatened.
The temptation to appeal to authority or put our fates in the hands of a wise few—kings, popes, strongmen—is a powerful one. In science, we elevate our most esteemed practitioners, ascribing to them an almost alien degree of intelligence and insight. In democracy, we can’t help but think that our problems could be readily dispatched if the right person could simply impose their will and break the gridlock of our current system.
It’s worth resisting these temptations. As uninspiring as it may be to admit that nobody has all the answers, there are good reasons why the humble principle of fallibilism plays such a large role in the most powerful ideas of modernity.
It’s not simply that it’s true, as a matter of principle, that anyone can be wrong. It’s that foregrounding this idea—holding up fallibility as a foundational piece of our worldview, rather than begrudgingly admitting that we have to live with it—makes it much easier to correct mistakes when they do occur (as they inevitably must). And even better, it helps us recognize mistakes that we might otherwise ignore.
This is an area where democracy might be able to learn something from the practice of science. It’s not only that we finite human beings can always be wrong, it’s that we are often wrong in predictable, systematic ways. Science recognizes that people are not perfect reasoning machines. We have biases, intuitions, tendencies, blind spots. Much of the methodology of science, from double-blind studies to anonymous refereeing, is explicitly dedicated to correcting for these human foibles.
Government, where decisions made in a moment can affect millions of people for a lifetime, needs constant reminders of its fallibility. A big part of that has to be a proper respect for the methods of science, as well as for its substantive discoveries. Psychologists assure us that human beings have a strong desire to accept things as true because we want them to be true, not only because they are the best explanation for what we observe. In the hands of policy-makers, that natural tendency can have deadly consequences. Science has developed impressive (though not infallible) techniques for correcting for such biases; our government could stand to do a bit better.
The most obvious thing that our government can do, and our society along with it, is to help science to flourish in its own right, and accept what it has to teach us. Sometimes research tells us answers we don’t want to hear—that human activity is warming the planet, that we share a common ancestor with other living beings here on Earth, or that the universe is winding down toward its ultimate heat death. We need the courage to face up to the truth, whatever it turns out to be.
The practice of science is one of those human activities that elevates our lives a bit above merely surviving from day to day. Our brains, as wonderfully imperfect as they are, didn’t evolve to solve problems in quantum mechanics or biochemistry. But we haven’t been content to use our intelligence merely to scrounge up food and shelter. We’ve turned our attention to the far stretches of the cosmos, the depths of time, and the mysteries of our own consciousness, and returned with remarkable discoveries.