The project began, in one telling, five years ago, in a castle that overlooks the Bavarian Alps, where three dozen of the world’s most successful and rivalrous earth scientists came together for a week of cloistered meetings.
They gathered, in part, out of embarrassment. For the past four decades, their field—the study of Earth’s natural phenomena, including its land, ocean, and climate—had boomed. Generations of young researchers who once would have become nuclear physicists or oil geologists instead pursued careers in glaciology and paleoclimatology. Governments, hoping to understand the dangers of global warming, had poured hundreds of millions of dollars into climate science. And the work was good. It gave humans a new way of seeing Earth: We learned to map the flow of the oceans, to chart the growth of continent-spanning glaciers, and to read the evidence left behind in lake mud and caves by ancient rainstorms, droughts, and hurricanes.
Which is, you know, nice. It’s fun to play weatherman for people who lived 1,000 years ago. Yet for all the immeasurable wonder and glory, and for all those millions of National Science Foundation dollars spent, there was one fundamental question on which climate scientists had not really made progress. It is among the field’s oldest and most purely scientific questions—it was first investigated in 1896 by Svante Arrhenius, a Nobel laureate—yet central to understanding modern, human-caused climate change. But for all that importance, climate scientists five years ago would have answered the question much as they did in 1979. It is this trouble that brought the empiricists to Bavaria. They wanted a better answer to the question, which is: If you greatly increase the amount of carbon dioxide in the atmosphere, how hot will the planet get?