Breaking the Global-Warming Gridlock
Both sides on the issue of greenhouse gases frame their arguments in terms of science, but each new scientific finding only raises new questions—dooming the debate to a pointless spiral. It's time, the authors argue, for a radically new approach: if we took practical steps to reduce our vulnerability to today's weather, we would go a long way toward solving the problem of tomorrow's climate.
In the last week of October, 1998, Hurricane Mitch stalled over Central America, dumping between three and six feet of rain within forty-eight hours, killing more than 10,000 people in landslides and floods, triggering a cholera epidemic, and virtually wiping out the economies of Honduras and Nicaragua. Several days later some 1,500 delegates, accompanied by thousands of advocates and media representatives, met in Buenos Aires at the fourth Conference of the Parties to the United Nations Framework Convention on Climate Change. Many at the conference pointed to Hurricane Mitch as a harbinger of the catastrophes that await us if we do not act immediately to reduce emissions of carbon dioxide and other so-called greenhouse gases. The delegates passed a resolution of “solidarity with Central America” in which they expressed concern “that global warming may be contributing to the worsening of weather” and urged “governments, ... and society in general, to continue their efforts to find permanent solutions to the factors which cause or may cause climate events.” Children wandering bereft in the streets of Tegucigalpa became unwitting symbols of global warming.
But if Hurricane Mitch was a public-relations gift to environmentalists, it was also a stark demonstration of the failure of our current approach to protecting the environment. Disasters like Mitch are a present and historical reality, and they will become more common and more deadly regardless of global warming. Underlying the havoc in Central America were poverty, poor land-use practices, a degraded local environment, and inadequate emergency preparedness—conditions that will not be alleviated by reducing greenhouse-gas emissions.
At the heart of this dispiriting state of affairs is a vitriolic debate between those who advocate action to reduce global warming and those who oppose it. The controversy is informed by strong scientific evidence that the earth’s surface has warmed over the past century. But the controversy, and the science, focus on the wrong issues, and distract attention from what needs to be done. The enormous scientific, political, and financial resources now aimed at the problem of global warming create the perfect conditions for international and domestic political gridlock, but they can have little effect on the root causes of global environmental degradation, or on the human suffering that so often accompanies it. Our goal is to move beyond the gridlock and stake out some common ground for political dialogue and effective action.
In politics everything depends on how an issue is framed: the terms of debate, the allocation of power and resources, the potential courses of action. The issue of global warming has been framed by a single question: Does the carbon dioxide emitted by industrialized societies threaten the earth’s climate? On one side are the doomsayers, who foretell environmental disaster unless carbon-dioxide emissions are immediately reduced. On the other side are the cornucopians, who blindly insist that society can continue to pump billions of tons of greenhouse gases into the atmosphere with no ill effect, and that any effort to reduce emissions will stall the engines of industrialism that protect us from a Hobbesian wilderness. From our perspective, each group is operating within a frame that has little to do with the practical problem of how to protect the global environment in a world of six billion people (and counting). To understand why global-warming policy is a comprehensive and dangerous failure, therefore, we must begin with a look at how the issue came to be framed in this way. Two converging trends are implicated: the evolution of scientific research on the earth’s climate, and the maturation of the modern environmental movement.
Since the beginning of the Industrial Revolution the combustion of fossil fuels—coal, oil, natural gas—has powered economic growth and also emitted great quantities of carbon dioxide and other greenhouse gases. More than a century ago the Swedish chemist Svante Arrhenius and the American geologist T. C. Chamberlin independently recognized that industrialization could lead to rising levels of carbon dioxide in the atmosphere, which might in turn raise the atmosphere's temperature by trapping heat radiated from the earth's surface that would otherwise escape into space—a "greenhouse effect" intensified by human activity. In the late 1950s the geophysicist Roger Revelle, arguing that the world was making itself the subject of a giant "geophysical experiment," worked to establish permanent stations for monitoring carbon-dioxide levels in the atmosphere. Monitoring documented what theory had predicted: atmospheric carbon dioxide was increasing.
In the United States the first high-level government mention of global warming was buried deep within a 1965 White House report on the nation’s environmental problems. Throughout the 1960s and 1970s global warming—at that time typically referred to as “inadvertent modification of the atmosphere,” and today embraced by the term “climate change”—remained an intriguing hypothesis that caught the attention of a few scientists but generated little concern among the public or environmentalists. Indeed, some climate researchers saw evidence for global cooling and a future ice age. In any case, the threat of nuclear war was sufficiently urgent, plausible, and horrific to crowd global warming off the catastrophe agenda.
Continued research, however, fortified the theory that fossil-fuel combustion could contribute to global warming. In 1977 the nonpartisan National Academy of Sciences issued a study called Energy and Climate, which carefully suggested that the possibility of global warming "should lead neither to panic nor to complacency." Rather, the study continued, it should "engender a lively sense of urgency in getting on with the work of illuminating the issues that have been identified and resolving the scientific uncertainties that remain." As is typical with National Academy studies, the primary recommendation was for more research.

In the early 1980s the carbon-dioxide problem received its first sustained attention in Congress, in the form of hearings organized by Representative Al Gore, who had become concerned about global warming when he took a college course with Roger Revelle, twelve years earlier. In 1983 the Environmental Protection Agency released a report detailing some of the possible threats posed by the anthropogenic, or human-caused, emission of carbon dioxide, but the Reagan Administration decisively downplayed the document. Two years later a prestigious international scientific conference in Villach, Austria, concluded that climate change deserved the attention of policymakers worldwide. The following year, at a Senate fact-finding hearing stimulated by the conference, Robert Watson, a climate scientist at NASA, testified, "Global warming is inevitable. It is only a question of the magnitude and the timing."
At that point global warming was only beginning to insinuate itself into the public consciousness. The defining event came in June of 1988, when another NASA climate scientist, James Hansen, told Congress with “ninety-nine percent confidence” that “the greenhouse effect has been detected, and it is changing our climate now.” Hansen’s proclamation made the front pages of major newspapers, ignited a firestorm of public debate, and elevated the carbon-dioxide problem to pre-eminence on the environmental agenda, where it remains to this day. Nothing had so galvanized the environmental community since the original Earth Day, eighteen years before.
Historically, the conservation and environmental movements have been rooted in values that celebrate the intrinsic worth of unspoiled landscape and propagate the idea that the human spirit is sustained through communion with nature. More than fifty years ago Aldo Leopold, perhaps the most important environmental voice of the twentieth century, wrote, "We face the question whether a still higher 'standard of living' is worth its cost in things natural, wild, and free. For us of the minority, ... the chance to find a pasque-flower is a right as inalienable as free speech." But when global warming appeared, environmentalists thought they had found a justification better than inalienable rights—they had found facts and rationality, and they fell head over heels in love with science.
Of course, modern environmentalists were already in the habit of calling on science to help advance their agenda. In 1967, for example, the Environmental Defense Fund was founded with the aim of using science to support environmental protection through litigation. But global warming was, and is, different. It exists as an environmental issue only because of science. People can’t directly sense global warming, the way they can see a clear-cut forest or feel the sting of urban smog in their throats. It is not a discrete event, like an oil spill or a nuclear accident. Global warming is so abstract that scientists argue over how they would know if they actually observed it. Scientists go to great lengths to measure and derive something called the “global average temperature” at the earth’s surface, and the total rise in this temperature over the past century—an increase of about six tenths of a degree Celsius as of 1998—does suggest warming. But people and ecosystems experience local and regional temperatures, not the global average. Furthermore, most of the possible effects of global warming are not apparent in the present; rather, scientists predict that they will occur decades or even centuries hence. Nor is it likely that scientists will ever be able to attribute any isolated event—a hurricane, a heat wave—to global warming.
A central tenet of environmentalism is that less human interference in nature is better than more. The imagination of the environmental community was ignited not by the observation that greenhouse-gas concentrations were increasing but by the scientific conclusion that the increase was caused by human beings. The Environmental Defense Fund, perhaps because of its explicitly scientific bent, was one of the first advocacy groups to make this connection. As early as 1984 its senior scientist, Michael Oppenheimer, wrote on the op-ed page of The New York Times,
With unusual unanimity, scientists testified at a recent Senate hearing that using the atmosphere as a garbage dump is about to catch up with us on a global scale.... Carbon dioxide emissions from fossil fuel combustion and other “greenhouse” gases are throwing a blanket over the Earth.... The sea level will rise as land ice melts and the ocean expands. Beaches will erode while wetlands will largely disappear.... Imagine life in a sweltering, smoggy New York without Long Island’s beaches and you have glimpsed the world left to future generations.
Preserving tropical jungles and wetlands, protecting air and water quality, slowing global population growth—goals that had all been justified for independent reasons, often by independent organizations—could now be linked to a single fact, anthropogenic carbon-dioxide emissions, and advanced along a single political front, the effort to reduce those emissions. Protecting forests, for example, could help fight global warming because forests act as “sinks” that absorb carbon dioxide. Air pollution could be addressed in part by promoting the same clean-energy sources that would reduce carbon-dioxide emissions. Population growth needed to be controlled in order to reduce demand for fossil-fuel combustion. And the environmental community could reinvigorate its energy-conservation agenda, which had flagged since the early 1980s, when the effects of the second Arab oil shock wore off. Senator Timothy Wirth, of Colorado, spelled out the strategy in 1988: “What we’ve got to do in energy conservation is try to ride the global warming issue. Even if the theory of global warming is wrong, to have approached global warming as if it is real means energy conservation, so we will be doing the right thing anyway in terms of economic policy and environmental policy.” A broad array of environmental groups and think tanks, including the Environmental Defense Fund, the Sierra Club, Greenpeace, the World Resources Institute, and the Union of Concerned Scientists, made reductions in carbon-dioxide emissions central to their agendas.
The moral problem seemed clear: human beings were causing the increase of carbon dioxide in the atmosphere. But the moral problem existed only because of a scientific fact—a fact that not only provided justification for doing many of the things that environmentalists wanted to do anyway but also dictated the overriding course of action: reduce carbon-dioxide emissions. Thus science was used to rationalize the moral imperative, unify the environmental agenda, and determine the political solution.
The summer of 1988 was stultifyingly hot even by Washington, D.C., standards, and the Mississippi River basin was suffering a catastrophic drought. Hansen’s proclamation that the greenhouse effect was “changing our climate now” generated a level of public concern sufficient to catch the attention of many politicians. George Bush, who promised to be “the environmental President” and to counter “the greenhouse effect with the White House effect,” was elected that November. Despite his campaign rhetoric, the new President was unprepared to offer policies that would curtail fossil-fuel production and consumption or impose economic costs for uncertain political gains. Bush’s advisers recognized that support for scientific research offered the best solution politically, because it would give the appearance of action with minimal political risk.
With little debate the Republican Administration and the Democratic Congress in 1990 created the U.S. Global Change Research Program. The program’s annual budget reached $1 billion in 1991 and $1.8 billion in 1995, making it one of the largest science initiatives ever undertaken by the U.S. government. Its goal, according to Bush Administration documents, was “to establish the scientific basis for national and international policymaking related to natural and human-induced changes in the global Earth system.” A central scientific objective was to “support national and international policymaking by developing the ability to predict the nature and consequences of changes in the Earth system, particularly climate change.” A decade and more than $16 billion later, scientific research remains the principal U.S. policy response to climate change.
Meanwhile, the marriage of environmentalism and science gave forth issue: diplomatic efforts to craft a global strategy to reduce carbon-dioxide emissions. Scientists, environmentalists, and government officials, in an attempt to replicate the apparently successful international response to stratospheric-ozone depletion that was mounted in the mid-1980s, created an institutional structure aimed at formalizing the connection between science and political action. The Intergovernmental Panel on Climate Change was established through the United Nations, to provide snapshots of the evolving state of scientific understanding. The IPCC issued major assessments in 1990 and 1996; a third is due early next year. These assessments provide the basis for action under a complementary mechanism, the United Nations Framework Convention on Climate Change. Signed by 154 nations at the 1992 “Earth Summit” in Rio de Janeiro, the convention calls for voluntary reductions in carbon-dioxide emissions. It came into force as an international treaty in March of 1994, and has been ratified by 181 nations. Signatories continue to meet in periodic Conferences of the Parties, of which the most significant to date occurred in Kyoto in 1997, when binding emissions reductions for industrialized countries were proposed under an agreement called the Kyoto Protocol.
The IPCC defines climate change as any sort of change in the earth’s climate, no matter what the cause. But the Framework Convention restricts its definition to changes that result from the anthropogenic emission of greenhouse gases. This restriction has profound implications for the framing of the issue. It makes all action under the convention hostage to the ability of scientists not just to document global warming but to attribute it to human causes. An apparently simple question, Are we causing global warming or aren’t we?, has become the obsessional focus of science—and of policy.
Finally, if the reduction of carbon-dioxide emissions is an organizing principle for environmentalists, scientists, and environmental-policy makers, it is also an organizing principle for all those whose interests might be threatened by such a reduction. It’s easy to be glib about who they might be—greedy oil and coal companies, the rapacious logging industry, recalcitrant automobile manufacturers, corrupt foreign dictatorships—and easy as well to document the excesses and absurdities propagated by some representatives of these groups. Consider, for example, the Greening Earth Society, which “promotes the optimistic scientific view that CO2 is beneficial to humankind and all of nature,” and happens to be funded by a coalition of coal-burning utility companies. One of the society’s 1999 press releases reported that “there will only be sufficient food for the world’s projected population in 2050 if atmospheric concentrations of carbon dioxide are permitted to increase, unchecked.” Of course, neither side of the debate has a lock on excess or distortion. The point is simply that the climate-change problem has been framed in a way that catalyzes a determined and powerful opposition.
When anthropogenic carbon-dioxide emissions became the defining fact for global environmentalism, scientific uncertainty about the causes and consequences of global warming emerged as the apparent central obstacle to action. As we have seen, the Bush Administration justified its huge climate-research initiative explicitly in terms of the need to reduce uncertainty before taking action. Al Gore, by then a senator, agreed, explaining that “more research and better research and better targeted research is absolutely essential if we are going to eliminate the remaining areas of uncertainty and build the broader and stronger political consensus necessary for the unprecedented actions required to address this problem.” Thus did a Republican Administration and a Democratic Congress—one side looking for reasons to do nothing, the other seeking justification for action—converge on the need for more research.
How certain do we need to be before we take action? The answer depends, of course, on where our interests lie. Environmentalists can tolerate a good deal more uncertainty on this issue than can, say, the executives of utility or automobile companies. Science is unlikely to overcome such a divergence in interests. After all, science is not a fact or even a set of facts; rather, it is a process of inquiry that generates more questions than answers. The rise in anthropogenic greenhouse-gas emissions, once it was scientifically established, simply pointed to other questions. How rapidly might carbon-dioxide levels rise in the future? How might climate respond to this rise? What might be the effects of that response? Such questions are inestimably complex, their answers infinitely contestable and always uncertain, their implications for human action highly dependent on values and interests.
Having wedded themselves to science, environmentalists must now cleave to it through thick and thin. When research results do not support their cause, or are simply uncertain, they cannot resort to values-based arguments, because their political opponents can portray such arguments as an opportunistic abandonment of rationality. Environmentalists have tried to get out of this bind by invoking the “precautionary principle”—a dandified version of “better safe than sorry”—to advance the idea that action in the presence of uncertainty is justified if potential harm is great. Thus uncertainty itself becomes an argument for action. But nothing is gained by this tactic either, because just as attitudes toward uncertainty are rooted in individual values and interests, so are attitudes toward potential harm.
Charged by the Framework Convention to search for proof of harm, scientists have turned to computer models of the atmosphere and the oceans, called general circulation models, or GCMs. Carbon-dioxide levels and atmospheric temperatures are measures of the physical state of the atmosphere. GCMs, in contrast, are mathematical representations that scientists use to try to understand past climate conditions and predict future ones. With GCMs scientists seek to explore how climate might respond under different influences—for example, different rates of carbon-dioxide increase. GCMs have calculated global average temperatures for the past century that closely match actual surface-temperature records; this gives climate modelers some confidence that they understand how climate behaves.

Computer models are a bit like Aladdin's lamp—what comes out is very seductive, but few are privy to what goes on inside. Even the most complex models, however, have one crucial quality that non-experts can easily understand: their accuracy can be fully evaluated only after seeing what happens in the real world over time. In other words, predictions of how climate will behave in the future cannot be proved accurate today. There are other fundamental problems with relying on GCMs. The ability of many models to reproduce temperature records may in part reflect the fact that the scientists who designed them already "knew the answer." As John Firor, a former director of the National Center for Atmospheric Research, has observed, climate models "are made by humans who tend to shape or use their models in ways that mirror their own notion of what a desirable outcome would be." Although various models can reproduce past temperature records, and yield similar predictions of future temperatures, they are unable to replicate other observed aspects of climate, such as cloud behavior and atmospheric temperature, and they diverge widely in predicting specific regional climate phenomena, such as precipitation and the frequency of extreme weather events. Moreover, when models agree about future temperatures, there is no way to know in advance whether they agree because they are similarly right or because they are similarly wrong.
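The point that a good fit to the past guarantees nothing about the future can be made with a deliberately simplified sketch—not a real GCM, and not any model described in this article. In the toy calculation below (all numbers and functional forms are illustrative assumptions), two "models" are each tuned to the same synthetic century of temperature data, track it comparably well, and still diverge when extrapolated forward.

```python
# A deliberately simplified illustration (not a real climate model): two toy
# "models" are each tuned to the same historical temperature record, track it
# reasonably well, and yet diverge when extrapolated into the future.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observed" global-average temperature anomalies, 1900-2000,
# rising roughly 0.6 degrees C over the century with year-to-year noise.
years = np.arange(1900, 2001)
observed = 0.006 * (years - 1900) + rng.normal(0.0, 0.08, size=years.size)

# Toy model A: warming grows linearly with time.
def model_a(t, a):
    return a * (t - 1900)

# Toy model B: warming levels off as time goes on.
def model_b(t, b):
    return b * np.log1p((t - 1900) / 20.0)

# "Tune" each model to the historical record by least squares on its single
# free parameter (a crude stand-in for how modelers adjust parameters).
basis_a = model_a(years, 1.0)
basis_b = model_b(years, 1.0)
a_fit = np.sum(basis_a * observed) / np.sum(basis_a ** 2)
b_fit = np.sum(basis_b * observed) / np.sum(basis_b ** 2)

rms_a = np.sqrt(np.mean((model_a(years, a_fit) - observed) ** 2))
rms_b = np.sqrt(np.mean((model_b(years, b_fit) - observed) ** 2))

future = np.array([2050, 2100])
print(f"hindcast RMS error, model A: {rms_a:.3f} C")
print(f"hindcast RMS error, model B: {rms_b:.3f} C")
print(f"model A projection for 2050, 2100: {model_a(future, a_fit).round(2)} C")
print(f"model B projection for 2050, 2100: {model_b(future, b_fit).round(2)} C")
# Both toy models track the past century, but their projections for 2100
# differ substantially -- and nothing in the historical fit says which, if
# either, is right. Only the future itself can settle the question.
```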
In spite of such pitfalls, a fundamental assumption of both U.S. climate policy and the UN Framework Convention is that increasingly sophisticated models, run on faster computers and supported by more data, will yield predictions that can resolve political disputes and guide action. The promise of better predictions is irresistible to champions of carbon-dioxide reduction, who, after all, must base their advocacy on the claim that anthropogenic greenhouse-gas emissions will be harmful in the future. But regardless of the sophistication of such predictions, new findings will almost inevitably be accompanied by new uncertainties—that’s the nature of science—and may therefore act to fuel, rather than to quench, political debate. Our own prediction is that increasingly complex mathematical models that delve ever more deeply into the intricacies and the uncertainties of climate will only hinder political action.
An example of how more scientific research fuels political debate came in 1998, when a group of prominent researchers released the results of a model analyzing carbon-dioxide absorption in North America. Their controversial findings, published in the prestigious journal Science, suggested that the amount of carbon dioxide absorbed by U.S. forests might be greater than the amount emitted by the nation’s fossil-fuel combustion. This conclusion has two astonishing implications. First, the United States—the world’s most profligate energy consumer—may not be directly contributing to rising atmospheric levels of carbon dioxide. Second, the atmosphere seems to be benefiting from young forests in the eastern United States that are particularly efficient at absorbing carbon dioxide. But these young forests exist only because old-growth forests were clear-cut in the eighteenth and nineteenth centuries to make way for farms that were later abandoned in favor of larger, more efficient midwestern farms. In other words, the possibility that the United States is a net carbon-dioxide sink does not reflect efforts to protect the environment; on the contrary, it reflects a history of deforestation and development.
Needless to say, these results quickly made their way into the political arena. At a hearing of the House Resources Committee, Representative John E. Peterson, of Pennsylvania, a Republican, asserted, “There are recent studies that show that in the Northeast, where we have continued to cut timber, and have a regenerating, younger forest, that the greenhouse gases are less when they leave the forest.... So a young, growing, vibrant forest is a whole lot better for clean air than an old dying forest.” George Frampton, the director of the White House Council on Environmental Quality, countered, “The science on this needs a lot of work.... we need more money for scientific research to undergird that point of view.” How quickly the tables can turn: here was a conservative politician wielding (albeit with limited coherence) the latest scientific results to justify logging old-growth forests in the name of battling global warming, while a Clinton Administration official backpedaled in the manner more typically adopted by opponents of action on climate change—invoking the need for more research.
That’s a problem with science—it can turn around and bite you. An even more surprising result has recently emerged from the study of Antarctic glaciers. A strong argument in favor of carbon-dioxide reduction has been the possibility that if temperatures rise owing to greenhouse-gas emissions, glaciers will melt, the sea level will rise, and populous coastal zones all over the world will be inundated. The West Antarctic Ice Sheet has been a subject of particular concern, both because of evidence that it is now retreating and because of geologic studies showing that it underwent catastrophic collapse at least once in the past million years or so. “Behind the reasoned scientific estimates,” Greenpeace warns, “lies the possibility of ... the potential catastrophe of a six metre rise in sea level.” But recent research from Antarctica shows that this ice sheet has been melting for thousands of years. Sea-level rise is a problem, but anthropogenic global warming is not the only culprit, and reducing emissions cannot be the only solution.
To make matters more difficult, some phenomena, especially those involving human behavior, are intrinsically unpredictable. Any calculation of future anthropogenic global warming must include an estimate of rates of fossil-fuel combustion in the coming decades. This means that scientists must be able to predict not only the amounts of coal, oil, and natural gas that will be consumed but also changes in the mixture of fossil fuels and other energy sources, such as nuclear, hydro-electric, and solar. These predictions rest on interdependent factors that include energy policies and prices, rates of economic growth, patterns of industrialization and technological innovation, changes in population, and even wars and other geopolitical events. Scientists have no history of being able to predict any of these things. For example, their inability to issue accurate population projections is “one of the best-kept secrets of demography,” according to Joel Cohen, the director of the Laboratory of Populations at Rockefeller University. “Most professional demographers no longer believe they can predict precisely the future growth rate, size, composition and spatial distribution of populations,” Cohen has observed.
Predicting the human influence on climate also requires an understanding of how climate behaved “normally,” before there was any such influence. But what are normal climate patterns? In the absence of human influence, how stationary is climate? To answer such questions, researchers must document and explain the behavior of the pre-industrial climate, and they must also determine how the climate would have behaved over the past two centuries had human beings not been changing the composition of the atmosphere. However, despite the billions spent so far on climate research, Kevin Trenberth, a senior scientist at the National Center for Atmospheric Research, told the Chicago Tribune last year, “This may be a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.” The National Academy of Sciences reported last year that “deficiencies in the accuracy, quality, and continuity of the [climate] records ... place serious limitations on the confidence” of research results.
If the normal climate is non-stationary, then the task of identifying the human fingerprint in global climate change becomes immeasurably more difficult. And the idea of a naturally stationary climate may well be chimerical. Climate has changed often and dramatically in the recent past. In the 1940s and 1950s, for example, the East Coast was hammered by a spate of powerful hurricanes, whereas in the 1970s and 1980s hurricanes were much less common. What may appear to be “abnormal” hurricane activity in recent years is abnormal only in relation to this previous quiet period. As far as the ancient climate goes, paleoclimatologists have found evidence of rapid change, even over periods as short as several years. Numerous influences could account for these changes. Ash spewed high into the atmosphere by large volcanoes can reflect solar radiation back into space and result in short-term cooling, as occurred after the 1991 eruption of Mount Pinatubo. Variations in the energy emitted by the sun also affect climate, in ways that are not yet fully understood. Global ocean currents, which move huge volumes of warm and cold water around the world and have a profound influence on climate, can speed up, slow down, and maybe even die out over very short periods of time—perhaps less than a decade. Were the Gulf Stream to shut down, the climate of Great Britain could come to resemble that of Labrador.
Finally, human beings have been changing the surface of the earth for millennia. Scientists increasingly realize that deforestation, agriculture, irrigation, urbanization, and other human activities can lead to major changes in climate on a regional or perhaps even a global scale. Thomas Stohlgren, of the U.S. Geological Survey, has written, “The effects of land use practices on regional climate may overshadow larger-scale temperature changes commonly associated with observed increases in carbon dioxide.” The idea that climate may constantly be changing for a variety of reasons does not itself undercut the possibility that anthropogenic carbon dioxide could seriously affect the global climate, but it does confound scientific efforts to predict the consequences of carbon-dioxide emissions.
If predicting how climate will change is difficult and uncertain, predicting how society will be affected by a changing climate—especially at the local, regional, and national levels, where decision-making takes place—is immeasurably more so. And predicting the impact on climate of reducing carbon-dioxide emissions is so uncertain as to be meaningless. What we do know about climate change suggests that there will be winners and losers, with some areas and nations potentially benefiting from, say, longer growing seasons or more rain, and others suffering from more flooding or drought. But politicians have no way to accurately calibrate the human and economic effects of global warming, or the benefits of reducing carbon-dioxide emissions.
Imagine yourself a leading policymaker in a poor, overpopulated, undernourished nation with severe environmental problems. What would it take to get you worried about global warming? You would need to know not just that global warming would make the conditions in your country worse but also that any of the scarce resources you applied to reducing carbon-dioxide emissions would lead to more benefits than if they were applied in another area, such as industrial development or housing construction. Such knowledge is simply unavailable. But you do know that investing in industrial development or better housing would lead to concrete political, economic, and social benefits.
More specifically, suppose that many people in your country live in shacks on a river’s floodplain. Floodplains are created and sustained by repeated flooding, so floods are certain to occur in the future, regardless of global warming. Given a choice between building new houses away from the floodplain and converting power plants from cheap local coal to costlier imported fuels, what would you do? New houses would ensure that lives and homes would be saved; a new power plant would reduce carbon-dioxide emissions but leave people vulnerable to floods. In the developing world the carbon-dioxide problem pales alongside immediate environmental and developmental problems. The China Daily reported during the 1997 Kyoto Conference:
The United States ... and other nations made the irresponsible demand ... that the developing countries should make commitments to limiting greenhouse gas emissions.... As a developing country, China has 60 million poverty-stricken people and China’s per capita gas emissions are only one-seventh of the average amount of more developed countries. Ending poverty and developing the economy must still top the agenda of [the] Chinese government.
For the most part, the perspectives of those in the developing world—about 80 percent of the planet’s population—have been left outside the frame of the climate-change discussion. This is hardly surprising, considering that the frame was defined mainly by environmentalists and scientists in affluent nations. Developing nations, meanwhile, have quite reasonably refused to agree to the targets for carbon-dioxide reduction set under the Kyoto Protocol. The result may feel like a moral victory to some environmentalists, who reason that industrialized countries, which caused the problem to begin with, should shoulder the primary responsibility for solving it. But the victory is hollow, because most future emissions increases will come from the developing world. In affluent nations almost everyone already owns a full complement of energy-consuming devices. Beyond a certain point increases in income do not result in proportional increases in energy consumption; people simply trade in the old model for a new and perhaps more efficient one. If present trends continue, emissions from the developing world are likely to exceed those from the industrialized nations within the next decade or so.
Twelve years after carbon dioxide became the central obsession of global environmental science and politics, we face the following two realities:
First, atmospheric carbon-dioxide levels will continue to increase. The Kyoto Protocol, which represents the world’s best attempt to confront the issue, calls for industrialized nations to reduce their emissions below 1990 levels by the end of this decade. Political and technical realities suggest that not even this modest goal will be achieved. To date, although eighty-four nations have signed the Kyoto Protocol, only twenty-two nations—half of them islands, and none of them major carbon-dioxide emitters—have ratified it. The United States Senate, by a vote of 95-0 in July of 1997, indicated that it would not ratify any climate treaty that lacked provisions requiring developing nations to reduce their emissions. The only nations likely to achieve the emissions commitments set under Kyoto are those, like Russia and Ukraine, whose economies are in ruins. And even successful implementation of the treaty would not halt the progressive increase in global carbon-dioxide emissions.
Second, even if greenhouse-gas emissions could somehow be rolled back to pre-industrial levels, the impacts of climate on society and the environment would continue to increase. Climate affects the world not just through phenomena such as hurricanes and droughts but also because of societal and environmental vulnerability to such phenomena. The horrific toll of Hurricane Mitch reflected not an unprecedented climatic event but a level of exposure typical in developing countries where dense and rapidly increasing populations live in environmentally degraded conditions. Similar conditions underlay more-recent disasters in Venezuela and Mozambique.
If these observations are correct, and we believe they are essentially indisputable, then framing the problem of global warming in terms of carbon-dioxide reduction is a political, environmental, and social dead end. We are not suggesting that humanity can with impunity emit billions of tons of carbon dioxide into the atmosphere each year, or that reducing those emissions is not a good idea. Nor are we making the nihilistic point that since climate undergoes changes for a variety of reasons, there is no need to worry about additional changes imposed by human beings. Rather, we are arguing that environmentalists and scientists, in focusing their own, increasingly congruent interests on carbon-dioxide emissions, have framed the problem of global environmental protection in a way that can offer no realistic prospect of a solution.
Local weather is the day-to-day manifestation of global climate. Weather is what we experience, and lately there has been plenty to experience. In recent decades human, economic, and environmental losses from disasters related to weather have increased dramatically. Insurance-industry data show that insured losses from weather have been rising steadily. A 1999 study by the German firm Munich Reinsurance Company compared the 1960s with the 1990s and concluded that “the number of great natural catastrophes increased by a factor of three, with economic losses—taking into account the effects of inflation—increasing by a factor of more than eight and insured losses by a factor of no less than sixteen.” And yet scientists have been unable to observe a global increase in the number or the severity of extreme weather events. In 1996 the IPCC concluded, “There is no evidence that extreme weather events, or climate variability, has increased, in a global sense, through the 20th century, although data and analyses are poor and not comprehensive.”
What has unequivocally increased is society’s vulnerability to weather. At the beginning of the twentieth century the earth’s population was about 1.6 billion people; today it is about six billion people. Almost four times as many people are exposed to weather today as were a century ago. And this increase has, of course, been accompanied by enormous increases in economic activity, development, infrastructure, and interdependence. In the past fifty years, for example, Florida’s population rose fivefold; 80 percent of this burgeoning population lives within twenty miles of the coast. The great Miami hurricane of 1926 made landfall over a small, relatively poor community and caused about $76 million worth of damage (in inflation-adjusted dollars). Today a storm of similar magnitude would strike a sprawling, affluent metropolitan area of two million people, and could cause more than $80 billion worth of damage. The increase in vulnerability is far more dramatic in the developing world, where in an average year tens of thousands of people die in weather-related disasters. According to the World Disasters Report 1999, 80 million people were made homeless by weather-related disasters from 1988 to 1997. As the population and vulnerability of the developing world continue to rise, such numbers will continue to rise as well, with or without global warming.
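The Miami comparison rests on a simple kind of arithmetic: scale a historical loss by how much more population and wealth now sit in a storm's path. The sketch below shows the form of such a calculation only; the growth multipliers are illustrative assumptions chosen to reproduce the article's rough figure, not the authors' actual numbers.

```python
# A minimal sketch of loss "normalization": scaling a historical storm's
# damage to reflect how much more population and wealth are exposed today.
# The 1926 figure comes from the article; the growth multipliers below are
# illustrative assumptions, not the authors' numbers.

damage_1926 = 76e6                      # dollars, 1926 Miami hurricane (inflation-adjusted, per the article)

# Hypothetical exposure growth in the affected area since 1926:
population_growth_factor = 30.0         # assumed: many times more residents in the storm's path
wealth_per_capita_growth_factor = 35.0  # assumed: far more property and infrastructure per person

normalized_damage = damage_1926 * population_growth_factor * wealth_per_capita_growth_factor

print(f"Estimated damage if a 1926-scale storm struck today: ${normalized_damage / 1e9:.0f} billion")
# With these assumed factors the estimate lands near $80 billion -- the same
# order of magnitude as the article's figure -- without any change in the
# storm itself, only in what lies in its path.
```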
Environmental vulnerability is also on the rise. The connections between weather impacts and environmental quality are immediate and obvious—much more so than the connections between global warming and environmental quality. Deforestation, the destruction of wetlands, and the development of fragile coastlines can greatly magnify flooding; floods, in turn, can mobilize toxic chemicals in soil and storage facilities and cause devastating pollution of water sources and harm to wildlife. Poor agricultural, forest-management, and grazing practices can exacerbate the effects of drought, amplify soil erosion, and promote the spread of wildfires. Damage to the environment due to deforestation directly contributed to the devastation wrought by Hurricane Mitch, as denuded hillsides washed away in catastrophic landslides, and excessive development along unmanaged floodplains put large numbers of people in harm's way.

Our view of climate and the environment draws on people's direct experience and speaks to widely shared values. It therefore has an emotional and moral impact that can translate into action. This view is framed by four precepts. First, the impacts of weather and climate are a serious threat to human welfare in the present and are likely to get worse in the future. Second, the only way to reduce these impacts is to reduce societal vulnerability to them. Third, reducing vulnerability can be achieved most effectively by encouraging democracy, raising standards of living, and improving environmental quality in the developing world. Fourth, such changes offer the best prospects not only for adapting to a capricious climate but also for reducing carbon-dioxide emissions.
The implicit moral imperative is not to prevent human disruption of the environment but to ameliorate the social and political conditions that lead people to behave in environmentally disruptive ways. This is a critical distinction—and one that environmentalists and scientists embroiled in the global-warming debate have so far failed to make.
To begin with, any global effort to reduce vulnerability to weather and climate must address the environmental conditions in developing nations. Poor land-use and natural-resource-management practices are, of course, a reflection of poverty, but they are also caused by government policies, particularly those that encourage unsustainable environmental activities. William Ascher, a political scientist at Duke University, has observed that such policies typically do not arise out of ignorance or lack of options but reflect conscious tradeoffs made by government officials faced with many competing priorities and political pressures. Nations, even poor ones, have choices. It was not inevitable, for example, that Indonesia would promote the disastrous exploitation of its forests by granting subsidized logging concessions to military and business leaders. This was the policy of an autocratic government seeking to manipulate powerful sectors of society. In the absence of open, democratically responsive institutions, Indonesian leaders were not accountable for the costs that the public might bear, such as increased vulnerability to floods, landslides, soil erosion, drought, and fire. Promoting democratic institutions in developing nations could be the most important item on an agenda aimed at protecting the global environment and reducing vulnerability to climate. Environmental groups concerned about the consequences of climate change ought to consider reorienting their priorities accordingly.
Such long-term efforts must be accompanied by activities with a shorter-term payoff. An obvious first step would be to correct some of the imbalances created by the obsession with carbon dioxide. For example, the U.S. Agency for International Development has allocated $1 billion over five years to help developing nations quantify, monitor, and reduce greenhouse-gas emissions, but is spending less than a tenth of that amount on programs to prepare for and prevent disasters. These priorities should be rearranged. Similarly, the United Nations’ International Strategy for Disaster Reduction is a relatively low-level effort that should be elevated to a status comparable to that of the Framework Convention on Climate Change.
Intellectual and financial resources are also poorly allocated in the realm of science, with research focused disproportionately on understanding and predicting basic climatic processes. Such research has yielded much interesting information about the global climate system. But little priority is given to generating and disseminating knowledge that people and communities can use to reduce their vulnerability to climate and extreme weather events. For example, researchers have made impressive strides in anticipating the impacts of some relatively short-term climatic phenomena, notably El Niño and La Niña. If these advances were accompanied by progress in monitoring weather, identifying vulnerable regions and populations, and communicating useful information, we would begin to reduce the toll exacted by weather and climate all over the world.
A powerful international mechanism for moving forward already exists in the Framework Convention on Climate Change. The language of the treaty offers sufficient flexibility for new priorities. The text states that signatory nations have an obligation to “cooperate in preparing for adaptation to the impacts of climate change [and to] develop and elaborate appropriate and integrated plans for coastal zone management, water resources and agriculture, and for the protection and rehabilitation of areas ... affected by drought and desertification, as well as floods.”
The idea of improving our adaptation to weather and climate has been taboo in many circles, including the realms of international negotiation and political debate. “Do we have so much faith in our own adaptability that we will risk destroying the integrity of the entire global ecological system?” Vice President Gore asked in his book Earth in the Balance (1992). “Believing that we can adapt to just about anything is ultimately a kind of laziness, an arrogant faith in our ability to react in time to save our skin.” For environmentalists, adaptation represents a capitulation to the momentum of human interference in nature. For their opponents, putting adaptation on the table would mean acknowledging the reality of global warming. And for scientists, focusing on adaptation would call into question the billions of tax dollars devoted to research and technology centered on climate processes, models, and predictions.
Yet there is a huge potential constituency for efforts focused on adaptation: everyone who is in any way subject to the effects of weather. Reframing the climate problem could mobilize this constituency and revitalize the Framework Convention. The revitalization could concentrate on coordinating disaster relief, debt relief, and development assistance, and on generating and providing information on climate that participating countries could use in order to reduce their vulnerability.
An opportunity to advance the cause of adaptation is on the horizon. The U.S. Global Change Research Program is now finishing its report on the National Assessment of the Potential Consequences of Climate Variability and Change. The draft includes examples from around the United States of why a greater focus on adaptation to climate makes sense. But it remains to be seen if the report will redefine the terms of the climate debate, or if it will simply become fodder in the battle over carbon-dioxide emissions.
Finally, efforts to reduce carbon-dioxide emissions need not be abandoned. The Framework Convention and its offshoots also offer a promising mechanism for promoting the diffusion of energy-efficient technologies that would reduce emissions. Both the convention and the Kyoto Protocol call on industrialized nations to share new energy technologies with the developing world. But because these provisions are coupled to carbon-dioxide-reduction mandates, they are trapped in the political gridlock. They should be liberated, promoted independently on the basis of their intrinsic environmental and economic benefits, and advanced through innovative funding mechanisms. For example, as the United Nations Development Programme has suggested, research into renewable-energy technologies for poor countries could be supported in part by a modest levy on patents registered under the World Intellectual Property Organization. Such ideas should be far less divisive than energy policies advanced on the back of the global-warming agenda.
As an organizing principle for political action, vulnerability to weather and climate offers everything that global warming does not: a clear, uncontroversial story rooted in concrete human experience, observable in the present, and definable in terms of unambiguous and widely shared human values, such as the fundamental rights to a secure shelter, a safe community, and a sustainable environment. In this light, efforts to blame global warming for extreme weather events seem maddeningly perverse, as if to say that those who died in Hurricane Mitch were symbols of the profligacy of industrialized society, rather than victims of poverty and the vulnerability it creates.
Such perversity shows just how morally and politically dangerous it can be to elevate science above human values. In the global-warming debate the logic behind public discourse and political action has been precisely backwards. Environmental prospects for the coming century depend far less on our strategies for reducing carbon-dioxide emissions than on our determination and ability to reduce human vulnerability to weather and climate.