Living in Candlestick Park
In the twenty-first century geopolitics might well take its metaphors from geology, as the state system of international relations gets shaken to its foundations.

Since the Cold War ended, we have seen victims of genocide being disinterred in Central Europe; African rivers choked with mutilated bodies; armed teenagers ruling Third World cities from the backs of pickup trucks; defeated dictators refusing to accept their own defeat; women forced back into isolation in the name of religion; emigrants clamoring to abandon old cultures for new ones they know only from television; terrorism striking with deadly efficiency where one might least expect it -- in the American heartland. It is enough to evoke a certain nostalgia for the old world order. The new one, as the French author Philippe Delmas points out, contains far too many people who are prepared "to turn around and disembowel one another over an acre of land, a hamlet, or some ancient totem."
What happened? How did patterns of behavior that most of us had thought buried in the past suddenly become our future? It might help, in explaining these unpleasant surprises, to retrieve a different image from the year 1989. The date was October 17, the time 5:04 P.M. Pacific Daylight Time, the place San Francisco's Candlestick Park. The Oakland Athletics and the San Francisco Giants were about to begin the third game of the World Series when a distant rumbling suddenly became an uncomfortable shaking, and the great Loma Prieta earthquake proceeded to pre-empt everything planned for that afternoon and for some time to come. Television cameras, before they were knocked off the air, caught the astonishment on the faces of players, fans, and anchorpersons alike as they abruptly acquired a Shakespearean insight: that there were more things in heaven and earth than had been dreamt of -- or at least adequately taken into account -- in their philosophy.

GAMES and the settings in which games are played are very different things. The Cold War once seemed a matter of life and death; but as the years rolled by and the Apocalypse did not arrive, it took on the character of a latter-day "great game," reminiscent of the long nineteenth-century conflict between the British and the Russians in Asia, which never quite produced a great war. Even the language of the Cold War became that of games: policymakers warned gravely of falling dominoes; theorists built billiard-ball models of world politics; critics of détente complained that the Soviet Union was playing chess while the most the Americans were managing was checkers. And in the end -- whatever Washington's ineptitude at chess -- the West somehow "won."
Implicit in all these metaphors was an important assumption: that however intense the rivalry, no one was going to hurl the checkers and chess pieces to the floor, or run off with the dominoes, or rip the billiard table's fabric down the middle. Whatever the game might be, the playing field would remain level. No earthquakes were anticipated. Even a nuclear war, a few strategists once thought, might be fought within certain "rules of the game," and an entire discipline -- game theory -- grew out of efforts to discover what those rules might be.
Today, though, the metaphors have shifted: geopoliticians sound more like geologists than like game theorists. The political scientist Samuel P. Huntington warns of "fault line conflicts" in which clashes of civilizations are bound to occur. The economist Lester C. Thurow sees "tectonic plates" colliding, with unpredictable consequences. The journalist Robert D. Kaplan predicts that seismic shocks will result from demographic and ecological pressures: "Though the havoc is unanticipated, stresses that build up gradually over the years cause the layers of crust to shift suddenly."
This new "tectonic" geopolitics suggests the need to rethink an old conflict. The Soviet-American "great game," it now appears, was taking place all along within an international system -- in effect, an arena -- whose stability we should not have taken for granted. Reminiscent of the teams at Candlestick Park, the Cold War superpowers competed even as historical processes of which they were only dimly aware were determining their future. It took the upheavals of 1989 to reveal these: to make it clear that the old rules, even the old games, may no longer apply.
WHEN geologists want to know the future, they look at the past: at the record of forces operating beneath the earth's crust that drive everything from the imperceptibly slow drifting of continents to the fault-line slips that cause catastrophic earthquakes. This method assumes -- safely enough, it would seem -- that processes in motion for millions of years are not about to disappear or reverse themselves overnight. Geologists still cannot say just where or exactly when a fault will give way. But that it will, sooner or later, is as certain as anything on this planet can be.
The end of the Cold War was an earthquakelike event in that it revealed deep and hitherto hidden sources of geopolitical strain. As is often the case in geology, though, it has taken a while to map these, and to find the faults they have produced. First impressions were that the critical fracture lay between democracy and capitalism on the one hand, and authoritarianism on the other. The Soviet Union collapsed, according to this view, because it was unable to feed or free its people at a time when prosperity and liberty had become normal for most of the rest of the developed world. Kremlin leaders found themselves in a classic Catch-22: their country could save itself only by ceasing to be what it was.
But this geopolitical map implied that democratization and marketization proceed in the same direction -- that no fault exists between them. If that were indeed the case, the disappearance of Soviet authoritarianism should have produced a stable post-Cold War landscape -- one in which the United States, which has sought a world safe for democracy and capitalism since at least the days of Woodrow Wilson, would be relatively comfortable. This has not happened, though. The aftershocks are continuing, and few Americans -- or others, for that matter -- feel at ease among them. So perhaps larger fractures lie elsewhere.
The tremors originate, some geopoliticians now believe, along a deeper fault, which separates processes of economic globalization and political fragmentation that began well before the Cold War and are sure to survive it. Ian Clark, of the University of Wales, Aberystwyth, explains "globalization" as "integration, interdependence, multilateralism, openness, and interpenetration." "Fragmentation," conversely, involves "disintegration, autarchy, unilateralism, . . . separatism, and heterogeneity." What is unsettling about this geopolitical map is that the fault it traces could be threatening the stability of all great powers. As the shakiest among them, the Soviet Union would simply have been the first to go.
States justify their existence, in large part, by securing their citizens' well-being, whether by creating and maintaining jobs, providing a social safety net, or protecting the environment. A regime that must leave its people at the mercy of market forces is not likely to enhance its reputation in their eyes. And yet globalization requires placing national economies within an inherently unpredictable international marketplace. When corporations can base themselves anywhere, when capital crosses boundaries as easily as birds do, when communication takes place at the speed of light and at virtually no cost, governments have little choice but to learn to live with invisible hands. A laissez-faire economic system is emerging at the global level a century after the welfare state was invented to limit laissez-faire economics at the national level. The social and political compromises that saved capitalism through an expansion of state authority early in the twentieth century no longer constrain it. And states now are as ill positioned as towns and villages were then to resist the buffeting of markets, or to relieve the dislocations they can produce.
Meanwhile, political fragmentation, by proliferating sovereignties, is diminishing sovereignty. It was easy to applaud the formation of new states when the result was to break up the old European colonial empires, or to bring down the former Soviet Union. But the process has not stopped there. Democracies, too, are feeling the centrifugal forces of separatism, as the Canadians, the British, the Spanish, the Belgians, and the Italians can testify. Indeed, with their respect for the principle of self-determination, democracies may be particularly vulnerable to such pressures. How many of them today would follow the American example and fight a blood-drenched civil war to deny some portion of their own citizenry the right to secede? And yet can we assume -- with examples like Chechnya and the former Yugoslavia in mind -- that secessions will always promote peace and justice? In a world of weaker states politics could become as volatile and indifferent as economics already is.
Thus states are getting hit from both sides: whether as the result of global economic trends or of particularist political pressures, their authority is diminishing. Since the prevailing view throughout most of this century has been that the power of states was increasing, this is very big news indeed. It is roughly the equivalent of finding that the San Andreas fault runs right under one's house. Confronted with such information, one would want to try, at a minimum, to understand the tectonics involved, to anticipate the damage when the fault finally slips, and to take whatever precautions might be possible now to shore up foundations, reinforce walls, and stabilize crockery.
STATES as we know them date back only about five centuries. Other ways of organizing human affairs existed prior to that time: monarchies, principalities, cities, clans, tribes. None, however, possessed the modern state's defining attribute, which is its claim -- not always achieved -- to monopolize the means of violence. Accomplishing that may not sound like progress, but consider the alternative: a world with the instruments of coercion shared among predatory warlords, roving mercenaries, invading hordes, urban gangs, bandits, and pirates. That is what the pre-state era often was like, and it was to provide some semblance of security from the prevailing disorder that states originated.
They certainly did not produce peace. But the organized wars of the eighteenth century were a distinct improvement over what had preceded them -- notably the Thirty Years' War, of 1618-1648, "an anarchic free-for-all of violently changing fortunes," as the historian David Kaiser has described it, which may have reduced Germany's population by as much as half. When, therefore, the early-nineteenth-century Prussian strategist Carl von Clausewitz wrote that war was an extension of policy by other means, he was not so much glorifying war as reacting against its excesses. Having lived through, and fought in, the Napoleonic Wars, he had every reason to know what the unconstrained use of force might involve.
Clausewitz insisted on the control of military conflict: on limiting violence to the minimum necessary to achieve belligerents' objectives. He was sensitive, as few other strategists have been, to war's unpredictabilities. He knew how easily terror, fatigue, and friction can frustrate even the most sophisticated planning. But this fear of chaos -- of losing control -- made all the more compelling Clausewitz's insistence that the initiation and conduct of war should be rational acts, in the sense of maintaining as close a correspondence as possible between the purposes of violence and its scale.
Subsequent wars, especially the two world wars, did not always meet that standard: hence their conduct has often been criticized from a Clausewitzian perspective. But those who started them sought to link available force with intended objectives. "Statesmen have sometimes been surprised by the nature of the war they have unleashed," Sir Michael Howard, one of the most astute students of Clausewitz, has pointed out, "and it is reasonable to assume that in at least fifty per cent of the cases they got a result they did not expect. But that is not the same as a war begun by mistake and continued with no political purpose."
The Cold War, in contrast, was Clausewitzian to the core. With the development of nuclear weapons, the means of violence had swollen to unimaginable proportions; but the great powers maintained such tight control that none resorted to any of those devices. Confronted with the possibility of their use, leaders as dissimilar as Eisenhower, Khrushchev, Macmillan, De Gaulle, and Mao Zedong found common ground in the urgency of living to play the game another day. Clausewitz's insistence that the instruments of violence not overwhelm the uses to which they are put has served us well, therefore. We probably owe our survival to it.
Cold War statesmen behaved so rationally, in fact, that theorists today rely heavily on "rational choice" models in thinking about the future. "Realists" and "neo-realists" assume that states know their interests and will consistently pursue them; a few have even advocated the controlled proliferation of nuclear weapons, apparently on the grounds that if these weapons induced rationality during the Cold War, they will do so at all times and in all places. Political economists, assuming aggregate if not individual rationality, are confident that states contemplating war in a globally interdependent economy will find that they cannot afford it. Democratic peace theory, too, takes rationality as a given. The argument here is that since no democracy has ever gone to war with another democracy, such states must prefer and will therefore choose peaceful over violent means of resolving disputes with each other. The number of democracies is increasing; so, too, should the prospects for peace.
Game theorists have even devised a mathematical concept to simulate these patterns of behavior, and its name is revealing: expected utility. The assumption, quite simply, is that people act only when they anticipate benefits from their actions. And given existing military, economic, and political realities, it is hard to see -- from a rational-choice perspective, at least -- how starting a war could benefit anyone in this day and age.
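In its barest textbook form -- a generic sketch rather than any particular theorist's model -- the calculation weights each possible outcome of an action by its estimated probability and compares the totals:

\[
\mathrm{EU}(a) \;=\; \sum_{s} p(s)\, u(a, s),
\qquad \text{choose war only if } \mathrm{EU}(\text{war}) > \mathrm{EU}(\text{peace}).
\]

Here p(s) stands for the probability attached to a possible state of the world and u(a, s) for the value the actor places on what follows; by this arithmetic, a war whose likely costs swamp its likely gains should never be begun at all.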
BUT what if behavior is not always rational? What if there are deeper forces -- rooted in the structure of international politics, or in the cultures that populate the world, or in human nature itself -- that get in the way of calculating expected utility? What if, in assuming rationality, we are continuing to play an old game even as we feel the earth beginning to shake beneath our feet?
Rational-choice scenarios, it is worth noting, assume the continued viability of states. Theorists who favor proliferating nuclear weapons expect them to remain under national authority: no one would want to pass them out to the Aum Shinrikyo cult, or the Montana militia, or some brilliant Unabomberlike loner. Businessmen look to states for the order within which commerce can flourish and contracts can be enforced; there is no rush these days to invest in places like Somalia or Sierra Leone, where such conditions are absent. Democracy could hardly survive if the constitutional protections that states provide were to vanish. If the spread of democracy promotes peace, therefore, that condition, too, requires that states survive and prosper.

Perhaps they will. States are not likely to disappear in the near future, and it is reasonable to expect that they will still be around in some form when the twenty-first century ends. The question is, In what form? Even rational-choice enthusiasts agree that states will not be as powerful as they have been -- that in contrast to the Orwellian nightmares that haunted much of this century, wide areas of human activity in the next one will lie beyond state control. The effects will in some ways be liberating, because states have so often been sources of oppression. But they have also brought stability, and that stability could be the precondition for such rational choices as human beings have made in managing violence over the past several hundred years.
One way to test that hypothesis would be to examine the geopolitical tectonics: to look at how wars were waged in the medieval, ancient, and even prehistoric eras, before states existed. Military historians are doing just that, and what they have found is causing some of them to question the relevance of Clausewitz to the post-Cold War world. John Keegan, whose writings have revolutionized the field, makes the argument most bluntly: "War is not the continuation of policy by other means.... Warfare is almost as old as man himself, and reaches into the most secret places of the human heart, places where self dissolves rational purpose, where pride reigns, where emotion is paramount, where instinct is king." The Clausewitzian view of war, the Israeli historian Martin van Creveld says, "is ... a modern invention ... Having been invented at a certain point in time, there is no reason to think that it possesses some kind of inherent validity, nor that it necessarily has a great future."
These experts are suggesting that if rationality does indeed mean matching the scale of violence to its purposes, then it is not clear who or what in a world of weaker states would perform that function. The historical indicators are not encouraging. For a thousand years following the fall of Rome, Van Creveld points out, "armed conflict was waged by ... barbarian tribes, the Church, feudal barons of every rank, free cities, even private individuals." To view such wars as Clausewitzian makes no sense, for they were "scarcely ... distinguishable from simple rapine and murder." Primitive society was no better: as the anthropologist Lawrence H. Keeley has shown, there were few if any "peaceful savages."
It is too deterministic to say that people are programmed for violence, like some aggressive species of ants. But the archaeological evidence shows that men -- and often women as well -- have been fighting wars for at least 5,000 years. Organized conflict emerged independently in cultures that had little or no contact with one another. It appears to have been as frequent among the inhabitants of pre-conquest North and South America as in Europe, Asia, and Africa. And collective killing goes back much further than that. It evolved initially, the social critic Barbara Ehrenreich argues in Blood Rites, as a defense against being eaten by wild animals at a time when our precursors as a species were themselves making the transition from prey to predator.
If this is true, if violence is that deeply embedded in human nature, then it must be at least as ancient as is the belief in the supernatural. "War appears to be far more robust than any particular religion," Ehrenreich observes, "perhaps more robust than religion in general." The revival of religion over the past quarter century would surely qualify as the sociological equivalent of a tectonic upheaval: this worldwide phenomenon is not what one might have anticipated in a supposedly secular age.
But what about the possibility that as state-sponsored violence declines, individually and culturally based violence may emerge, as unexpectedly as did religion, to replace it? If the human affinity for killing is as tenacious as faith, and if the states that have channeled that instinct into Clausewitzian patterns of rationality over the past several hundred years are, like secularism, beginning to decline, then another tectonic surprise may be on the way. What happened in Bosnia and Rwanda could be only the beginning of a future that turns out, as in geology, to reflect a very distant past.
THE prospect is a bleak one, and we should not accept it uncritically. As the end of the Cold War demonstrated, gloomy scenarios have no monopoly on getting the future right. The peaceful demise of a superpower showed that unprecedented events can occur -- that the past is not always a reliable guide to what is to come. It is by no means certain that the post-state era, if that is what we are entering, will echo its pre-state counterpart; there may be ways of preserving Clausewitzian rationality "by other means," even if states do gradually lose their capacity to perform that function.
International organizations are one possibility. As the Cold War wound down, hopes rose that the United Nations would at last fulfill its promise in resolving old conflicts and deterring new ones. But whether because of ineffective leadership, because of inadequate support from its most powerful members (especially the United States), or because too much was expected of it in the first place, the UN has yet to demonstrate any significant capacity to control large-scale violence over an extended period of time. Failures like those in Bosnia, Somalia, and Cambodia have forced a lowering of expectations. We have a long way to go before the UN can plausibly substitute for states as the keeper of Clausewitzian order.
Regional organizations are more robust, but their priorities are narrow. Even as it expands, the North Atlantic Treaty Organization concerns itself more with exclusion than inclusion. Keeping Russia out seems particularly at odds with the universally acclaimed example set by post-Second World War security structures, which brought Germany and Japan in. The European Union's priorities look equally askew: is creating a common currency really more vital than removing the economic disparities that divide Europe today almost as dramatically as did the old Iron Curtain? In the Asia-Pacific region, where cooperative action failed to prevent an economic crash, pressures are building for a return to controlled markets. Controlled politics have never disappeared there.
Meanwhile, the majority of the world's population remains saddled with economic systems that have failed, or never took off in the first place. But the telecommunications revolution -- which functions in all societies -- is making these have-nots more aware than ever of what they do not have, even as demographic pressures and ecological deterioration endanger the little they still possess. If, as the historian Paul Kennedy has warned, "the continued abuse of the developing world's environment leads to global warming, or if there is a massive flood of economic refugees from the poorer to the richer parts of the world, everyone will suffer." No transnational institution, governmental or nongovernmental, has even begun to address this problem, despite its potential as the greatest of all breeding grounds for violence in the twenty-first century.
For all their good intentions and often impressive accomplishments, international organizations have a common problem: it is that of collective leadership commanding limited resources. As Clausewitz could have pointed out, restraining violence, like unleashing it, requires both capabilities and resolve; these are hard to achieve when many are in charge and the instruments at hand are few. Transnational institutions, then, face their own Catch-22. They may someday be in a position to counter the decline of states and the disorder that will probably follow. But like the old Soviet Union, they will accomplish this task only by ceasing to be what they now are.
IF the institutional approach seems unpromising, what about the opposite end of the spectrum: a change in the behavior of individuals, so that there would be less violence for states -- or their successors -- to restrain in the first place? The idea is not as far-fetched as it might seem. We are, after all, creatures of evolution, and our survival suggests at least limited success in moderating self-destructive tendencies. Violence is not the only aspect of our character -- and character itself can change.
The political scientist James Q. Wilson has pointed out that regardless of culture, region, or religious belief, most people today would agree on what constitutes an atrocity -- there is a nearly universal sense of horror. Shifts in standards of behavior must have produced this consensus, for it cannot always have been present: societies that once tolerated human sacrifice and slavery, for example, no longer do so. With the twentieth century's quantum leap in the speed and ubiquity of communication, this shared moral sense seems likely to expand.
Political behavior, too, may be changing. The political scientist Francis Fukuyama detects a gradual but irreversible trend toward self-government and away from the old tradition of authority by imposition. The proliferation of democracies, he insists, arises out of a long-term change in human collective consciousness. And the political scientist John Mueller has made the case that attitudes toward war itself are evolving: that in light of the devastation twentieth-century conflicts have caused, the very idea of fighting a war in the twenty-first century -- among the great powers, at least -- will attract more ridicule than respect.
If these trends hold up, we will face some interesting possibilities. New patterns of behavior may evolve in time and with sufficient strength to compensate for the decline of states and the probable ineffectiveness of international organizations. One tectonic force could counter another. Scientists have found that under certain circumstances even inanimate objects -- molecules, crystals, representations of randomness on a computer screen -- have the capacity for self-organization. If some similar phenomenon could work in the world of geopolitics -- if we could "self-organize" rationality without having to rely on states or international institutions to enforce it -- then the prospects for the next century would be a good deal better than one might think.
THERE is, however, another, darker path to order in a disorderly world that few people today want to talk about: "empire" is the form of governance that hardly dares speak its name. Surely, it would seem, we are living in a post-imperial age. European colonial empires have long since crumbled; and even though Soviet and American spheres of influence took on imperial attributes during the Cold War, those structures, too, are mostly part of the past. If economic integration and political self-determination are eroding the authority of states, then these forces ought to be all the more destabilizing for empires, founded as they so often were on the denial of just those principles.
And yet -- a geologist would caution that before we consider volcanoes extinct, or faults stable, we should check the underlying tectonics. After all, states have existed for roughly 500 years, but empires -- like war -- go back almost ten times as long. What assurance do we have that our epoch, which is clearly one of inactive empires, is also the one uniquely privileged to witness the end of empires for all time?
None whatever, if science fiction is any guide. Perhaps it should be, since novelists and filmmakers spend at least as much time as anyone else relating past and present trends to the long-term future. Empires have hardly disappeared from their imagined worlds; indeed, they show up so frequently -- in everything from Isaac Asimov's classic Foundation series to George Lucas's hugely popular Star Wars sagas -- that it is hard to imagine the genre without them. Viewing the future through Darth Vader's eyes may seem, well, slightly flaky. But to anyone who failed to understand the purpose, so would the sight of miners carrying canaries into mine shafts. Early-warning systems must be both impressionable and expressive -- and false alarms by no means render them useless. For these reasons alone we should not too quickly rule out a future for empires.
The historical record here supports the visionaries. No empire has endured indefinitely, but cycles of imperial consolidation and decline -- the rise and fall of empires -- are one of the few persistent patterns in history. Astronomers know that stars are constantly igniting and burning themselves out. The fact that none ultimately survive is no reason to regard the process that produces them as extinct -- or as irrelevant to the future. Empires, on this planet at least, appear and disappear in much the same way.
What is it, then, that causes them to do so? The arrogance of ambitious leaders, to be sure: Alexander the Great, Napoleon, and Hitler built empires -- and quickly lost them -- through the force of personality. But such instances are relatively rare. Empires have more often arisen from a determination to spread a religion or an ideology, or out of hope for economic gain, or as a response to the prospect of anarchy along one's borders. The first two inducements may be obsolete: in an age of global communication and markets, empires are hardly necessary to disseminate ideas or secure profits. But empires as a method of imposing order -- that is another matter entirely.
For all their injustices, empires have frequently achieved a kind of Clausewitzian rationality. Like states, they have sought to monopolize the means of violence; and because their purposes paralleled one another, empires and states coexisted for several hundred years. Some states, like England and France, transformed themselves into empires; certain empires, like those of the Hapsburgs and the Ottomans, spawned new states. But as democracy spread in the twentieth century, the priorities of empires and states began to diverge.
The democratic state must assume -- even if it does not in every respect ensure -- the equality of those subject to its rule. Empires, in contrast, require inequality: a powerful center asserts its authority over weaker peripheries, at times with their consent, more often without it. That is why, as this "democratic" century ends, there are no traditional empires left. Some of the processes that produced them remain in place, though, and that raises an interesting question about the next century: Is equality or inequality likely to be the dominant theme?
An answer is already emerging, and it is not reassuring. The new laissez-faire economics is distributing wealth in an unprecedentedly unequal manner throughout the world. That, Karl Marx would have said, is what one would expect from capitalism; he expected an international proletarian revolution as a consequence. States proved him wrong by cushioning capitalism's excesses during the twentieth century; had they not done so, the democratization that dominates our era could hardly have taken hold. How will democracy fare, though, if the twenty-first century is one of increasing economic inequality and diminishing state authority?
Suppose Marx should turn out to have been right after all. Suppose unregulated capitalism provokes discontent on a global scale similar to what happened within the industrialized states a century ago. Suppose the anarchy Robert D. Kaplan anticipates in the poorer parts of the world spreads widely enough to alarm the richer parts. Neither states as presently constituted, nor international organizations, nor whatever slow shifts may be taking place in human nature are likely, by themselves, to contain such chaos. Empires, however, are a time-tested response to inequality and the unrest it brings -- and the human capacity to package old wine in new bottles (equipped, of course, with politically correct labels) ought never to be underestimated.
WHATEVER the packaging, the advertising will sound familiar: that without stability -- without some method of countering the human propensity for violence -- the prospects for advancing civilization are at risk. The argument has been made so often, and in support of such dubious causes, that we tend to dismiss it as special pleading -- as an excuse, however feeble, for aggression, exploitation, or discrimination. It seems a relic of an earlier age. We like to think that we are beyond the need for hierarchy and all it implies.
But this perspective, too, may reflect a failure to think tectonically. For it is not at all clear that the two great priorities of twentieth-century democratic capitalism -- economic integration and political self-determination -- will alone produce a better world. If, as appears increasingly likely, these undeniable virtues do not always complement each other, if the simultaneous pursuit of both means straddling a fault line, then seismic shocks are sure to come. And it would be arrogant in the extreme to assume that the past, which has witnessed so many upheavals, can offer no useful guidance in preparing for them.
Sir Isaiah Berlin, one of the wisest men of this century, often warned that values are not necessarily compatible: that the simple-minded pursuit of single virtues can subvert others. The essence of politics is the balancing of priorities, and this requires an ecological perspective -- a sense of the whole, along with a sensitivity to how things relate to one another. That is what seems to be missing as we approach the twenty-first century: the willingness to say that there can be too much of any good thing, that setting up self-determination, or free trade, or anything else, as an absolute priority is asking for trouble. It is like preparing for earthquakes only by stabilizing crockery, without worrying about the shelves, walls, roof, and foundation.
With the plate-tectonics revolution three decades ago, geology became an ecological discipline. It was possible for the first time to visualize the earth as a whole, and to understand how processes at work in some part of it could affect the rest. Geopolitics requires a similarly comprehensive perspective: we need to focus our attention as much on the arenas within which games are played as on the games themselves. We are no more likely than the geologists to predict precise outcomes. But we can at least prepare ourselves for Candlestick Park surprises: we can reinforce the bleachers, back up the communications links, mark the exits, and keep the emergency squad close at hand. We may even find a certain satisfaction -- players, fans, and anchorpersons alike -- in expanding our philosophy, and hence our dreams, to accommodate more of the things that are happening, if not in heaven then at least here on earth.