April 1999
by John Lewis Gaddis
Since the Cold War ended, we have seen victims of genocide being disinterred in Central Europe; African rivers choked with mutilated bodies; armed teenagers ruling Third World cities from the backs of pickup trucks; defeated dictators refusing to accept their own defeat; women forced back into isolation in the name of religion; emigrants clamoring to abandon old cultures for new ones they know only from television; terrorism striking with deadly efficiency where one might least expect it -- in the American heartland. It is enough to evoke a certain nostalgia for the old world order. The new one, as the French author Philippe Delmas points out, contains far too many people who are prepared "to turn around and disembowel one another over an acre of land, a hamlet, or some ancient totem."
What happened? How did patterns of behavior that most of us had thought buried in the past suddenly become our future? It might help, in explaining these unpleasant surprises, to retrieve a different image from the year 1989. The date was October 17, the time 5:04 P.M. Pacific Daylight, the place San Francisco's Candlestick Park. The Oakland Athletics and the San Francisco Giants were about to begin the third game of the World Series when a distant rumbling suddenly became an uncomfortable shaking, and the great Loma Prieta earthquake proceeded to pre-empt everything planned for that afternoon and for some time to come. Television cameras, before they were knocked off the air, caught the astonishment on the faces of players, fans, and anchorpersons alike as they abruptly acquired a Shakespearean insight: that there were more things in heaven and earth than had been dreamt of -- or at least adequately taken into account -- in their philosophy.
GAMES and the settings in which games are played are very different things. The Cold War once seemed a matter of life and death; but as the years rolled by and the Apocalypse did not arrive, it took on the character of a latter-day "great game," reminiscent of the long nineteenth-century conflict between the British and the Russians in Asia, which never quite produced a great war. Even the language of the Cold War became that of games: policymakers warned gravely of falling dominoes; theorists built billiard-ball models of world politics; critics of détente complained that the Soviet Union was playing chess while the most the Americans were managing was checkers. And in the end -- whatever Washington's ineptitude at chess -- the West somehow "won."
Implicit in all these metaphors was an important assumption: that however intense the rivalry, no one was going to hurl the checkers and chess pieces to the floor, or run off with the dominoes, or rip the billiard table's fabric down the middle. Whatever the game might be, the playing field would remain level. No earthquakes were anticipated. Even a nuclear war, a few strategists once thought, might be fought within certain "rules of the game," and an entire discipline -- game theory -- grew out of efforts to discover what those rules might be.
Today, though, the metaphors have shifted: geopoliticians sound more like geologists than like game theorists. The political scientist Samuel P. Huntington warns of "fault line conflicts" in which clashes of civilizations are bound to occur. The economist Lester C. Thurow sees "tectonic plates" colliding, with unpredictable consequences. The journalist Robert D. Kaplan predicts that seismic shocks will result from demographic and ecological pressures: "Though the havoc is unanticipated, stresses that build up gradually over the years cause the layers of crust to shift suddenly."
This new "tectonic" geopolitics suggests the need to rethink an old conflict. The Soviet-American "great game," it now appears, was taking place all along within an international system -- in effect, an arena -- whose stability we should not have taken for granted. Like the teams at Candlestick Park, the Cold War superpowers competed even as historical processes of which they were only dimly aware were determining their future. It took the upheavals of 1989 to reveal these: to make it clear that the old rules, even the old games, may no longer apply.
The end of the Cold War was an earthquakelike event in that it revealed deep and hitherto hidden sources of geopolitical strain. As is often the case in geology, though, it has taken a while to map these, and to find the faults they have produced. First impressions were that the critical fracture lay between democracy and capitalism on the one hand, and authoritarianism on the other. The Soviet Union collapsed, according to this view, because it was unable to feed or free its people at a time when prosperity and liberty had become normal for most of the rest of the developed world. Kremlin leaders found themselves in a classic Catch-22: their country could save itself only by ceasing to be what it was.
But this geopolitical map implied that democratization and marketization proceed in the same direction -- that no fault exists between them. If that were indeed the case, the disappearance of Soviet authoritarianism should have produced a stable post-Cold War landscape -- one in which the United States, which has sought a world safe for democracy and capitalism since at least the days of Woodrow Wilson, should be relatively comfortable. This has not happened, though. The aftershocks are continuing, and few Americans -- or others, for that matter -- feel at ease among them. So perhaps larger fractures lie elsewhere.
The tremors originate, some geopoliticians now believe, along a deeper fault, which separates processes of economic globalization and political fragmentation that began well before the Cold War and are sure to survive it. Ian Clark, of the University of Wales, Aberystwyth, explains "globalization" as "integration, interdependence, multilateralism, openness, and interpenetration." "Fragmentation," conversely, involves "disintegration, autarchy, unilateralism, . . . separatism, and heterogeneity." What is unsettling about this geopolitical map is that the fault it traces could be threatening the stability of all great powers. As the shakiest among them, the Soviet Union would simply have been the first to go.
States justify their existence, in large part, by securing their citizens' well-being, whether by creating and maintaining jobs, providing a social safety net, or protecting the environment. A regime that must leave its people at the mercy of market forces is not likely to enhance its reputation in their eyes. And yet globalization requires placing national economies within an inherently unpredictable international marketplace. When corporations can base themselves anywhere, when capital crosses boundaries as easily as birds do, when communication takes place at the speed of light and at virtually no cost, governments have little choice but to learn to live with invisible hands. A laissez-faire economic system is emerging at the global level a century after the invention of the welfare state limited laissez-faire economics at the national level. The social and political compromises that saved capitalism through an expansion of state authority early in the twentieth century no longer constrain it. And states now are as ill positioned as towns and villages were then to resist the buffeting of markets, or to relieve the dislocations they can produce.
Meanwhile, political fragmentation, by proliferating sovereignties, is diminishing sovereignty. It was easy to applaud the formation of new states when the result was to break up the old European colonial empires, or to bring down the former Soviet Union. But the process has not stopped there. Democracies, too, are feeling the centrifugal forces of separatism, as the Canadians, the British, the Spanish, the Belgians, and the Italians can testify. Indeed, with their respect for the principle of self-determination, democracies may be particularly vulnerable to such pressures. How many of them today would follow the American example and fight a blood-drenched civil war to deny some portion of their own citizenry the right to secede? And yet can we assume -- with examples like Chechnya and the former Yugoslavia in mind -- that secessions will always promote peace and justice? In a world of weaker states politics could become as volatile and indifferent as economics already is.
Thus states are getting hit from both sides: whether as the result of global economic trends or of particularist political pressures, their authority is diminishing. Since the prevailing view throughout most of this century has been that the power of states was increasing, this is very big news indeed. It is roughly the equivalent of finding that the San Andreas fault runs right under one's house. Confronted with such information, one would want to try, at a minimum, to understand the tectonics involved, to anticipate the damage when the fault finally slips, and to take whatever precautions might be possible now to shore up foundations, reinforce walls, and stabilize crockery.
Such rules certainly did not produce peace. But the organized wars of the eighteenth century were a distinct improvement over what had preceded them -- notably the Thirty Years' War, of 1618-1648, "an anarchic free-for-all of violently changing fortunes," as the historian David Kaiser has described it, which may have reduced Germany's population by as much as half. When, therefore, the early-nineteenth-century Prussian strategist Carl von Clausewitz wrote that war was an extension of policy by other means, he was not so much glorifying war as reacting against its excesses. Having lived through, and fought in, the Napoleonic Wars, he had every reason to know what the unconstrained use of force might involve.
Clausewitz insisted on the control of military conflict: on limiting violence to the minimum necessary to achieve belligerents' objectives. He was sensitive, as few other strategists have been, to war's unpredictabilities. He knew how easily terror, fatigue, and friction can frustrate even the most sophisticated planning. But this fear of chaos -- of losing control -- made all the more compelling Clausewitz's insistence that the initiation and conduct of war should be rational acts, in the sense of maintaining as close a correspondence as possible between the purposes of violence and its scale.
Subsequent wars, especially the two world wars, did not always meet that standard: hence their conduct has often been criticized from a Clausewitzian perspective. But those who started them sought to link available force with intended objectives. "Statesmen have sometimes been surprised by the nature of the war they have unleashed," Sir Michael Howard, one of the most astute students of Clausewitz, has pointed out, "and it is reasonable to assume that in at least fifty per cent of the cases they got a result they did not expect. But that is not the same as a war begun by mistake and continued with no political purpose."
The Cold War, in contrast, was Clausewitzian to the core. With the development of nuclear weapons, the means of violence had swollen to unimaginable proportions; but the great powers maintained such tight control that none resorted to any of those devices. Confronted with the possibility of their use, leaders as dissimilar as Eisenhower, Khrushchev, Macmillan, De Gaulle, and Mao Zedong found common ground in the urgency of living to play the game another day. Clausewitz's insistence that the instruments of violence not overwhelm the uses to which they are put has served us well, therefore. We probably owe our survival to it.
Cold War statesmen behaved so rationally, in fact, that theorists today rely heavily on "rational choice" models in thinking about the future. "Realists" and "neo-realists" assume that states know their interests and will consistently pursue them; a few have even advocated the controlled proliferation of nuclear weapons, apparently on the grounds that if these weapons induced rationality during the Cold War, they will do so at all times and in all places. Political economists, assuming aggregate if not individual rationality, are confident that states contemplating war in a globally interdependent economy will find that they cannot afford it. Democratic peace theory, too, takes rationality as a given. The argument here is that since no democracy has ever gone to war with another democracy, such states must prefer and will therefore choose peaceful over violent means of resolving disputes with each other. The number of democracies is increasing; so, too, should the prospects for peace.
Game theorists have even devised a mathematical concept to simulate these patterns of behavior, and its name is revealing: expected utility. The assumption, quite simply, is that people act only when they anticipate benefits from their actions. And given existing military, economic, and political realities, it is hard to see -- from a rational-choice perspective, at least -- how starting a war could benefit anyone in this day and age.
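The expected-utility calculus described above can be made concrete with a toy calculation. A minimal sketch, in which all payoffs and probabilities are hypothetical numbers chosen purely to illustrate the arithmetic, not estimates drawn from any real conflict:

```python
# Toy expected-utility comparison for a state weighing war against peace.
# Every payoff and probability below is a hypothetical illustration.

def expected_utility(outcomes):
    """Probability-weighted sum of payoffs: EU = sum(p_i * u_i)."""
    return sum(p * u for p, u in outcomes)

# Going to war: assume a 30% chance of victory (payoff +100)
# and a 70% chance of costly defeat (payoff -200).
eu_war = expected_utility([(0.3, 100), (0.7, -200)])   # -110.0

# Remaining at peace: a certain, modest gain from trade (payoff +20).
eu_peace = expected_utility([(1.0, 20)])               # 20.0

# A rational-choice actor selects the option with the higher expected utility.
choice = "peace" if eu_peace > eu_war else "war"
```

On these assumed numbers the calculus points to peace, which is the rational-choice theorist's point: given modern military and economic realities, the probability-weighted costs of war tend to swamp its prospective benefits.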
John Lewis Gaddis is the Robert Lovett Professor of History at Yale University. His books include The United States and the End of the Cold War: Implications, Reconsiderations, Provocations (1992) and We Now Know: Rethinking Cold-War History (1997).
Illustrations by Mirko Ilic
Copyright © 1999 by The Atlantic Monthly Company. All rights reserved.
The Atlantic Monthly; April 1999; Living in Candlestick Park; Volume 283, No. 4; pages 65 - 74.