Editor’s note: An earlier version of this story presented an economic-modeling assumption—the 0.1-percent chance of human extinction per year—as a vetted scholarly estimate. Following a correction from the Global Priorities Project, the text below has been updated.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. A new report from the U.K.-based Global Challenges Foundation urges us to take them seriously.
The nonprofit began its annual report on “global catastrophic risk” with a startling provocation: If figures often used to compute human extinction risk are correct, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, about one in 9,395 people dies in a crash; that translates to roughly a 0.01 percent chance per year. But that chance compounds over the course of a lifetime: at lifetime scales, about one in 120 Americans dies in a crash.
Yet the risk of human extinction due to climate change—or an accidental nuclear war, or a meteor—could be much higher than that. The Stern Review, the U.K. government’s premier report on the economics of climate change, assumed a 0.1-percent risk of human extinction every year. That may sound low, but it adds up when extrapolated across a century: at that rate, there would be a 9.5 percent chance of human extinction within 100 years.
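The arithmetic behind both figures is the same: a small annual risk, assumed constant and independent from year to year, compounds into a much larger cumulative one. A minimal sketch (the 80-year lifespan used for the car-crash comparison is an illustrative assumption, not a figure from the report):

```python
def cumulative_risk(annual_risk: float, years: int) -> float:
    """Probability of at least one occurrence over `years` years,
    assuming an independent, constant annual probability."""
    return 1 - (1 - annual_risk) ** years

# Stern Review assumption: a 0.1 percent annual extinction risk, over a century.
century = cumulative_risk(0.001, 100)
# ≈ 0.095, i.e. the 9.5 percent chance cited above.

# Car-crash comparison: a 1-in-9,395 annual risk over an assumed 80-year life.
lifetime = cumulative_risk(1 / 9395, 80)
# ≈ 0.0085, close to the article's 1-in-120 lifetime figure.
```

The key point is that cumulative risk grows almost linearly at first (0.1 percent per year looks like 10 percent over a century), with compounding shaving the total slightly, to 9.5 percent.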
And that number may understate the risk. An Oxford survey of experts from 2008 put the annual extinction risk higher, at 0.2 percent. The chance of dying in any major global calamity is likely higher still: the Stern Review, which supplies the 9.5-percent figure, considered only the danger of species-wide extinction, while the Global Challenges Foundation’s report is concerned with any event that would wipe out more than 10 percent of Earth’s human population.
“We don’t expect any of the events that we describe to happen in any 10-year period. They might—but, on balance, they probably won’t,” Sebastian Farquhar, the director of the Global Priorities Project, told me. “But there’s lots of events that we think are unlikely that we still prepare for.”
For instance, most people demand working airbags in their cars, and they buckle their seatbelts whenever they go for a drive, he said. We may know that the risk of an accident on any individual car ride is low, but we still believe it makes sense to reduce possible harm.
So what kinds of human-extinction events are these? The report ranks catastrophic climate change and nuclear war far above the rest, and for good reason. On the latter front, it cites multiple occasions when the world stood on the brink of atomic annihilation. Most of these occurred during the Cold War, but one took place during the 1990s, the most peaceful decade in recent memory:
In 1995, Russian systems mistook a Norwegian weather rocket for a potential nuclear attack. Russian President Boris Yeltsin retrieved launch codes and had the nuclear suitcase open in front of him. Thankfully, Russian leaders decided the incident was a false alarm.
Climate change also poses its own risks. As I’ve written about before, serious veterans of climate science now suggest that global warming will spawn continent-sized superstorms by the end of the century. Farquhar said that even more conservative estimates can be alarming: UN-approved climate models estimate that the risk of six to ten degrees Celsius of warming exceeds 3 percent, even if the world tamps down carbon emissions at a fast pace. “On a more plausible emissions scenario, we’re looking at a 10-percent risk,” Farquhar said. Few climate adaptation scenarios account for swings in global temperature this enormous.
Other risks won’t stem from technological hubris. In any given year, there is some chance of a supervolcano erupting or an asteroid careening into the planet. Both would of course devastate the areas around ground zero—but they would also kick dust into the atmosphere, blocking sunlight and sending global temperatures plunging. (Most climate scientists agree that the same phenomenon would follow any major nuclear exchange.)
Yet natural pandemics may pose the most serious risks of all. In fact, in the past two millennia, the only two events that experts can certify as global catastrophes of this scale were plagues. The Black Death of the 1340s felled more than 10 percent of the world population. Eight centuries prior, another epidemic of the Yersinia pestis bacterium—the “Great Plague of Justinian” in 541 and 542—killed between 25 and 33 million people, or between 13 and 17 percent of the global population at that time.
No event approached these totals in the 20th century. The twin wars did not come close: About 1 percent of the global population perished in the Great War, about 3 percent in World War II. Only the Spanish flu epidemic of the late 1910s, which killed between 2.5 and 5 percent of the world’s people, approached the medieval plagues. Farquhar said there’s some evidence that the First World War and Spanish influenza were the same catastrophic global event—but even then, the death toll only came to about 6 percent of humanity.
The report briefly explores other possible risks: a genetically engineered pandemic, geo-engineering gone awry, an all-seeing artificial intelligence. These, it clarifies, remain mostly notional threats, unlike nuclear war or global warming—even as it cautions:
[N]early all of the most threatening global catastrophic risks were unforeseeable a few decades before they became apparent. Forty years before the discovery of the nuclear bomb, few could have predicted that nuclear weapons would come to be one of the leading global catastrophic risks. Immediately after the Second World War, few could have known that catastrophic climate change, biotechnology, and artificial intelligence would come to pose such a significant threat.
So what’s the societal version of an airbag and seatbelt? Farquhar conceded that many existential risks were best handled by policies tailored to the specific issue, like reducing stockpiles of warheads or cutting greenhouse-gas emissions. But civilization could generally increase its resilience if it developed technology to rapidly accelerate food production. If technological society had the power to ramp up food sources that depend less on sunlight, especially, there would be a “lower chance that a particulate winter [from a volcano or nuclear war] would have catastrophic consequences.”
He also thought many problems could be helped if democratic institutions had some kind of ombudsman or committee to represent the interests of future generations. (This strikes me as a distinctly European proposal—in the United States, the national politics of a “representative of future generations” would be thrown off by the abortion debate and unborn personhood, I think.)
The report was a joint project of the Centre for Effective Altruism in London and the Future of Humanity Institute at the University of Oxford. It can be read online.