WHY do our best-laid plans so often go astray? Cities enact rent control in order to ensure an adequate supply of affordable housing, only to discourage further investment in residential property and make affordable housing harder to find. We ban certain drugs in order to protect society from their adverse consequences, thereby encouraging an illegal narcotics traffic that enriches criminals, fuels urban violence, and complicates relations with foreign governments. We welcome the globalization of world markets and hail the Asian economic miracle, only to be surprised when a currency crisis in Thailand triggers a financial panic in Korea and Japan.
Why is the human experience filled with unpleasant surprises like these? The main reason, as Robert Jervis makes clear in his intriguing new book, System Effects, is the difficulty of predicting how complex systems will behave. Drawing on a diverse body of scholarly research and a wealth of illuminating examples, Jervis shows that "system effects" are an important and often overlooked part of social and political life. And though he focuses most of his attention on issues of international politics, where system effects are especially prevalent, his insights will capture the imagination of those who puzzle over other social problems as well.
For Jervis, a system effect exists whenever "(a) a set of units or elements is interconnected so that changes in some elements or their relations produce changes in other parts of the system, and (b) the entire system exhibits properties and behaviors that are different from those of the parts." Thus a system can be a physical entity, such as the solar system; a living organism; or a social organization, such as a free market, a neighborhood, or an international alliance. Although the boundaries between different systems are largely arbitrary (for example, individual human beings are systems in their own right and also part of a nearly infinite number of social systems), system effects will be present whenever separate parts are linked in a larger whole.
According to Jervis, system effects occur for three distinct but interrelated reasons. First, in addition to the direct effects of any action, the interconnections within a system produce indirect and delayed effects. For example, altering the federal tax code has a direct effect on government revenues, and it also has indirect effects on investment decisions, consumer spending, and the incomes of accountants and lawyers.
Second, when there are more than two actors in a system, the relationship between any two will be determined not just by how they act toward each other but also by the interactions among the other members of the system. In a marriage, for example, the degree of harmony between husband and wife is determined not only by their own compatibility but also by the presence or absence of attractive co-workers, difficult children, and malevolent in-laws. Similarly, relations between two states are often affected primarily by what other states do -- as when two states form an alliance solely because a third state threatens them both.
Third, relations in a system are interactive rather than additive. Actions may not lead directly to an intended result, because the outcome also depends on how the other elements in the system respond. For example, one might think that acquiring a player like Shaquille O'Neal would make any basketball team stronger, but this strategy would backfire if his skills did not mesh well with those of the other players on the team. In a system the results of any particular move may depend on the moves that other actors make, and actions taken at one point in time may alter the entire system in ways that lead to very different behavior at a later stage. Relations of cause and effect are often murky, and determining who is responsible for what can be nearly impossible.
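The contrast between additive and interactive effects can be put in toy arithmetic. In the sketch below (the "fit" multiplier, the rosters, and every number are invented for illustration, not drawn from Jervis), adding a more talented player leaves the team weaker once fit is part of the model:

```python
# Additive vs. interactive effects, in miniature.
# An additive model would say team strength = sum of individual skills.
# An interactive model multiplies that sum by how well the players mesh.

def team_strength(skills, fit):
    """Total talent scaled by a chemistry ('fit') factor between 0 and 1."""
    return sum(skills) * fit

roster = [8, 7, 7, 6, 6]         # total talent 34, good chemistry
star_roster = [10, 8, 7, 7, 6]   # total talent 38, but the star disrupts fit

print(team_strength(roster, fit=1.0))        # 34.0
print(team_strength(star_roster, fit=0.75))  # 28.5 -- stronger on paper, weaker in play
```

On additive logic the second roster must be better (38 beats 34); the interaction term reverses the result, which is the sense in which outcomes in a system depend on how the parts respond to one another.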
Because system effects are everywhere, Jervis emphasizes, "we can never do merely one thing." Any step we take will have an infinite number of consequences, some that we intend and others that we neither intend nor foresee. A military buildup may deter a threatening adversary and help to preserve peace, for example, but it may also divert funds from other social needs, encourage one's allies to free-ride, and cause formerly neutral states to become friendlier with one's rivals. The more complex the system and the denser the interactions between the parts, the more difficult it is to anticipate the full effects of any action.
The stability of a given system is often determined by whether the interactions among the parts are characterized by negative or positive feedback. Negative feedback occurs when the elements of a system act to dampen any disturbance (as a thermostat responds to a shift in temperature), thereby tending to preserve the system in its initial state. In contrast, positive feedback reinforces the disturbance and magnifies its effects. "Were it not for negative feedback," Jervis points out, "there would be no stability as patterns would not last long enough to permit organized society. Without positive feedback, there could be no change and growth."
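The two feedback patterns can be sketched numerically. In the toy update rules below (the damping and gain parameters are invented for illustration), negative feedback corrects part of any deviation from a set point at each step, while positive feedback multiplies it:

```python
# Negative feedback: each step moves the system back toward a set point,
# so a disturbance dies out (as a thermostat damps a temperature shift).
# Positive feedback: each step amplifies the deviation, so a small
# disturbance grows (as a snowball can start an avalanche).

def negative_feedback(x, set_point=0.0, damping=0.5, steps=10):
    """Trajectory of a disturbance that is partially corrected each step."""
    path = [x]
    for _ in range(steps):
        x = x - damping * (x - set_point)   # remove half the remaining deviation
        path.append(x)
    return path

def positive_feedback(x, gain=1.5, steps=10):
    """Trajectory of a disturbance that feeds on itself each step."""
    path = [x]
    for _ in range(steps):
        x = x * gain                        # the deviation is amplified
        path.append(x)
    return path

print(f"negative feedback after 10 steps: {negative_feedback(1.0)[-1]:.4f}")  # shrinks toward 0
print(f"positive feedback after 10 steps: {positive_feedback(1.0)[-1]:.1f}")  # grows rapidly
```

The same initial disturbance all but vanishes under the first rule and grows more than fifty-fold under the second, which is why the mix of feedbacks in a system matters so much for its stability.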
In international politics the classic example of negative feedback is the balance of power. The fear of being dominated usually leads weaker states to join forces against stronger or more threatening ones, which is why trying to amass more and more power can be counterproductive and why it has been so difficult for any single state to establish an enduring hegemony over the others. The same logic inspired the system of checks and balances that is embedded in the U.S. Constitution. To enhance the stability of the new republic and to prevent it from falling prey to tyranny, the Founding Fathers divided power among the various branches of government and made it easy for factions to combine, thereby making it hard for a majority consistently to dominate the rest.
When positive feedback is prevalent, systems do not behave in this self-correcting fashion, and small causes may have very large effects. Just as a single snowball can start an avalanche when conditions are right, an entire neighborhood can be transformed if a minority population reaches a "tipping point" that persuades the predominant group to leave en masse. Positive feedback is central to the familiar domino theory, which argues that a single setback can damage a state's credibility, embolden its rivals, and demoralize its allies, thus making subsequent defeats likely. When positive feedback is at work, in short, instability is the norm, and rapid and unforeseen changes are to be expected.
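The tipping dynamic can be made concrete with a toy threshold model in the spirit of Thomas Schelling's work on neighborhood tipping (the tolerance values and population sizes below are invented for illustration). Each member of the predominant group stays until the minority share exceeds that member's personal tolerance; one departure raises the share, which can push the next-most-sensitive resident past their own threshold, and so on:

```python
# A minimal tipping-point sketch: departures cascade once the minority
# share crosses the lowest remaining tolerance threshold.

def run_tipping(tolerances, minority):
    """Return how many majority-group residents remain once departures stop."""
    remaining = sorted(tolerances)          # most sensitive residents first
    while remaining:
        share = minority / (minority + len(remaining))
        if share <= remaining[0]:
            break                           # nobody else is pushed past a threshold
        remaining.pop(0)                    # the least tolerant resident departs
    return len(remaining)

# 90 majority residents whose tolerances run from 10% up to about 19%.
tolerances = [0.10 + 0.001 * i for i in range(90)]
print(run_tipping(tolerances, minority=10))  # 90 -- below every threshold, all stay
print(run_tipping(tolerances, minority=12))  # 0  -- past the tipping point, a full cascade
```

A minority of ten leaves the neighborhood unchanged, while twelve empties it entirely: a small cause, a very large effect, with no self-correction anywhere along the way.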
Unfortunately for would-be social engineers, both kinds of feedback can occur in most systems. This makes it difficult to know which behavior to expect and difficult to predict the effects of a policy change. States in the international system usually balance against powerful aggressors, but occasionally dominoes do fall and states do jump on the bandwagon. Sometimes issuing a threat can trigger an escalating spiral of mutually reinforcing hostility, but at other times a threat can deter a potential aggressor or lead both sides to make a conscious effort to relax tensions. Although Jervis does identify some of the conditions that make positive or negative feedback likely, statecraft remains a difficult art, because it is hard to know in advance which behavior will occur.