Letters to the editor

State of the Union

Jonathan Rauch, in "Bipolar Disorder" (January/February Atlantic), cites my book The Values Divide as a source for much of the commentary on whether and how deeply we are divided over cultural values. I am also the unnamed source who stated that Americans live in "parallel universes."

Rauch is correct that political elites are highly polarized on the basis of cultural values. But this is so because Americans have made it so. Today lifestyle choices and value preferences correlate directly with partisanship. Married voters—especially those with children under the age of seventeen living at home—and frequent church attendees vote Republican in large numbers. Single voters, unmarried couples, and infrequent churchgoers vote overwhelmingly Democratic. Polls taken by Zogby International consistently find that voters in these different demographic categories also make different cultural choices. For example, 43 percent of Bush supporters saw The Passion of the Christ; 65 percent of Kerry backers saw Fahrenheit 9/11. Cable television, Internet blogs, and other technologies that promote market segmentation only serve to ratify voters' instinctive predilections.

Something is amiss in American politics when exit polls showed moderates voting for John Kerry by nine points and independents by one point, and still Kerry lost. Although the polar extremes have always been with us at the elite and mass levels, the absence of a "vital center" makes today's politics different.

Alan Wolfe has described the twenty-first century as an age of "moral freedom." Redefining interpersonal relationships means that Americans have a greater number of choices than ever before. It is in these choices that politics, partisanship, and values have become intertwined. The result is an electorate whose highly polarized nature shows up in polling data in race after race. As Tom Robbins wrote in his novel Even Cowgirls Get the Blues, "Until humans can solve their philosophical problems, they're condemned to solve their political problems over and over and over again. It's a cruel, repetitious bore."

John Kenneth White
The Life Cycle Institute
The Catholic University of America
Washington, D.C.

In "Bipolar Disorder," Jonathan Rauch makes an eloquent and compelling case that the American nation isn't as polarized as the national parties. He also identifies the major causes of partisan polarization—gerrymandering and the selection of candidates by ideological activists rather than the local party machines of the old days. However, his conclusions—that partisan polarization can be beneficial and we should learn to love it—do not follow from his analysis. The present system, created only in the past thirty years, can be changed.

The most radical change would be to abandon our eighteenth-century first-past-the-post voting system, which tends to produce two-party cartels, for some version of proportional representation, which most modern democracies use to elect their legislators. If that seems too exotic, there is the alternative vote or instant-runoff voting (IRV), which is now used to elect city officials in San Francisco. Voters rank candidates in order of preference, and if nobody initially wins 50 percent, then second- and third-choice votes are redistributed until a winner emerges. Unlike the first-past-the-post system, IRV rewards moderate candidates who appeal to more than one party.
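
To make the redistribution step concrete, here is a minimal sketch of an IRV count in Python; the function name, ballot format, and arbitrary tie-breaking are illustrative assumptions on the editors' part, not San Francisco's actual election rules.

```python
from collections import Counter

def instant_runoff(ballots):
    """Pick a winner from ranked ballots by instant-runoff voting.

    Each ballot is a list of candidate names, most preferred first.
    This sketch breaks elimination ties arbitrarily; real election
    codes specify tie-breaking and exhausted-ballot handling.
    """
    remaining = {name for ballot in ballots for name in ballot}
    while True:
        # Count each ballot toward its highest-ranked surviving candidate.
        tallies = Counter({name: 0 for name in remaining})
        for ballot in ballots:
            for choice in ballot:
                if choice in remaining:
                    tallies[choice] += 1
                    break
        total = sum(tallies.values())
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > total:  # outright majority: we have a winner
            return leader
        # No majority yet: eliminate the weakest candidate, so those
        # ballots transfer to their next surviving choice.
        remaining.discard(min(tallies, key=tallies.get))
```

For example, given the ballots [["A", "B"], ["A", "C"], ["B", "C"], ["C", "B"], ["C", "B"]], no candidate starts with a majority; "B" is eliminated first, that ballot transfers to "C", and "C" wins the final count 3 to 2.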

Changing electoral rules nationwide is a long-term project. For the foreseeable future we are stuck with first-past-the-post and two major parties. Why not change the Republicans and the Democrats back into what Rauch says they once were—"loose coalitions of interests and regions" rather than "ideological clubs"? If primaries dominated by zealots are the problem, why not abolish primaries? We don't have to go back to the old system in which party bosses chose the candidates. The California gubernatorial-recall election, which installed the centrist Arnold Schwarzenegger as governor, showed us another way. Why not let the voters choose from many candidates, both partisan and independent, on the first ballot of a two-round election? There could be an actual runoff election a few weeks later among the top two vote-getters. Or an IRV ballot could produce the same result as a runoff, without the time and expense of a second election. Replacing party primaries with the first round in an actual or virtual runoff system would open up the ballot to new candidates and movements; it would reward centrist candidates with the broadest possible appeal and punish divisive ideologues; and it would tend to make gerrymandering futile.

Michael Lind
The New America Foundation
Washington, D.C.

Despite what Jonathan Rauch argues in "Bipolar Disorder," there is clear evidence of an increased political divide in American public opinion. In the latest survey by the Pew Research Center a majority of Americans perceived a growing divide both in the nation generally and among people they know. That perception stems partly from close political elections but especially from differences over the war in Iraq, which not only has driven a wedge between Republicans and Democrats but has intensified the partisan gap over fundamental national-security questions, including the appropriate use of force and America's place in the world.

The Pew Research Center's longitudinal measures on basic political, economic, and social values, which date back to 1987, show that political polarization in public opinion is as great now as it was before the 1994 midterm elections, which ended four decades of Democratic control in Congress. Partisan divisions are not in themselves new, but the basis for today's divide is considerably different from that of the divide that existed a few years ago. During the 1990s attitudes toward government, welfare, and business—and also homosexuality—were the strongest predictors of a person's party preference. Today opinions about the efficacy of force versus that of diplomacy and the obligation of Americans to fight for their country are by far the strongest predictors. The partisan gap in basic national-security values has never been more pronounced. Attitudes toward the Iraq War color opinions on a number of security issues. Republicans have become decidedly more militant, while Democrats, if anything, have become less so.

The widespread hostility Republicans felt toward the federal government in the 1990s has dissipated now that their party controls all branches of that government. Democrats have become much stronger advocates for government's social safety net, and thus differ more sharply with Republicans over related issues.

Interestingly, the partisan gap on most social issues, while substantial, has not increased in recent years. Over the past decade the political spectrum has shifted decidedly in favor of tolerance on issues relating to homosexuality and race. Similarly, the partisan gap on abortion remains large, but it has not grown substantially since the 1990s. The public's values are not simply a portrait of partisanship; a number of consensus values endure. Nevertheless, the points of public agreement on major subjects have been largely overshadowed by differences over national-security issues and America's place in the world.

Andrew Kohut
Pew Research Center
Washington, D.C.

Stephen S. Cohen and J. Bradford DeLong ("Shaken and Stirred," January/February Atlantic) are correct that America is entering a new economic era that will pose a greater threat to the security of our middle class than what has come before. But I think they attribute too much of this to the offshoring of services, they're too pessimistic about the possibility that high-wage jobs will replace those that are lost, and their suggested remedies are way too modest given the challenge ahead.

While it's true that service jobs constitute 83 percent of our non-farm employment, most of these jobs are in person-to-person services such as retail, restaurants, hotels, hospitals, surface transportation, education, child care, elder care, and the construction trades. These won't go offshore, because they require direct contact between provider and recipient. The biggest threat to jobs like these comes not from foreigners but from new software applications. Automated supermarket checkout lines, airport e-ticket kiosks, online brokerage services, and similar innovations will continue to push workers toward giving customers more personalized, less standardized attention.

Yes, many professional jobs at the higher-wage end of the middle class are or will soon be vulnerable to the twin threats of new software and low-cost foreign professionals. But this trend doesn't necessarily mean fewer knowledge-intensive jobs in America. As professional services become low-cost commodities, demand shifts to more complex and tailored high-end services. Stockbrokers become personal financial consultants. Software programmers become business consultants offering customized applications. Architects spend more time designing, less time drafting. More of what end users seek, and pay for, involves getting exactly what they want: good advice and customized fits, new ideas about how their unique needs can be met by special applications, and implementation of such ideas. Workers with such skills are and will be in ever greater demand.

As to what should be done, by all means let's equip more of America's middle class to be agile, as Cohen and DeLong suggest—not only through portable health and pension benefits and career-transition assistance, but also with wage insurance! But if middle-class Americans are to be capable of doing the knowledge work of the future, we also have to make sure they get the education they need, starting in early childhood, continuing in K through 12 with small classes and well-trained teachers and good after-school programs, and extending through at least two years of college. And we have to provide them with affordable health care—not only as a good in itself, but also because healthier people are inherently more productive. In all these respects, I fear, America is now heading in exactly the opposite direction.

Robert B. Reich
Former Secretary of Labor
Brandeis University
Waltham, Mass.

As Stephen Cohen and J. Bradford DeLong generously acknowledge, I have spent the past several years examining the economic risks facing American families, using a combination of historical and programmatic evidence and the best available statistical data. My key finding is simple: on a variety of measures, Americans' economic lives are much more insecure than they were twenty or thirty—or even ten—years ago. It is well known that economic inequality has risen since the early 1970s. Yet my evidence shows that the growth in the instability of family incomes—how much they vary over time—has far outpaced the growth in inequality. And not only do Americans' incomes bob up and down more than they used to, but, according to my calculations, when incomes fall, families frequently find themselves in dire financial straits.

What is especially striking about these trends is that they occur against the backdrop of a tremendous social change that Cohen and DeLong conspicuously neglect to mention: the growing economic role of women outside the home. As the economists Maria Cancian and Deborah Reed have documented, the proportion of families in which women work hours comparable to men's more than doubled from 1970 to 2000—to more than 60 percent. This huge shift helped buoy family incomes during a period when wages for most men remained close to stagnant. As my evidence shows, however, the instability of family incomes also increased dramatically. The two-worker family is the ultimate in private risk-sharing. Yet it seems that not even this most fundamental of insurance arrangements has been able to cushion against shocks to family incomes.

Nor has government or the corporate sector stepped into the breach. To be sure, some programs have been expanded. But the broader trend within the public sector—one that President Bush currently hopes to further by partially privatizing Social Security—is toward ever greater private risk management. When I looked beneath the statistics, I was shocked to find that the cushioning effect of public taxes and transfers on families experiencing drops in income, significant in the early 1970s, had all but evaporated by the late 1990s. And, of course, corporate America is running away from the ideal of security even more quickly. Out with guaranteed retirement benefits, in with 401(k)s. Out with generous health coverage, in with high-deductible policies—or no policies at all.

What can be done to address these troubling trends? Cohen and DeLong say less than they might in response to this crucial question. But it is clear that the first rule should be Hippocrates': Do no harm. Whether the misguided idea is Social Security privatization, medical savings accounts, or a new tax system that undermines the income insurance provided by a progressive code, defenders of economic security should ensure that government does not add to the growing insecurity Americans already face. Our economy is more open than ever, and that openness should be preserved. But it needs to be coupled with a new generation of insurance protection against collective risks that citizens can't effectively deal with on their own. Too often our public sector parcels out risks one by one to a series of disjointed, categorical programs that inevitably leave glaring gaps. In an age of rapid change we need a general program of income insurance that provides catastrophic coverage against a wide range of risks—not just to a favored few but to all Americans. This universal insurance could in turn be paired with new measures to encourage saving and wealth on the middle and lower rungs of the economic ladder. Call it an ownership and insurance society for all Americans.

Jacob S. Hacker
Yale University
New Haven, Conn.

It is gratifying to see concerns about the impact of global competition on living standards so well articulated by Stephen Cohen and J. Bradford DeLong. At the Economic Policy Institute we, often alone among economists, have raised these issues regarding blue-collar workers for years. One can't help wondering if some of this recent attention derives from the fact that global trade now has the potential to hurt not only those in old, dirty industries but also those in high-paying white-collar jobs—that is, folks like us.

But there's also a theoretical rationale. For decades economists have wielded the blunt ax of trade theory to push back against those of us who argued for addressing the damage inflicted on American workers by trade. Now, as the Nobel laureate Paul Samuelson warns, our competitors are coming after our comparative advantage (highly skilled workers), and the predicted outcomes from globalization are not nearly so unequivocally positive.

That said, I'd like to raise a couple of potentially hopeful points.

In a piece with many unsettling predictions, Cohen and DeLong make a particularly pessimistic claim: "The gold standard for rapid growth and shared prosperity"—the few decades following World War II—was "probably an aberration, a confluence of events … unlikely to be seen again."

These guys know their economic history, and they may be right, but I fear that they're glossing over a recent period that was both unique and instructive. In the late 1990s fast growth was accompanied by very tight labor markets and broadly shared prosperity. Productivity accelerated, unemployment fell to its lowest rate in thirty years—four percent in 2000—and, much as it did in the "golden era," full employment ensured that the benefits of faster growth were broadly shared.

What's more, two important factors were different in the 1990s. First, union power was much less significant than in the 1950s and 1960s, and second, our work force was far more exposed to global competition than it had been in the postwar period. In fact, trade (imports plus exports) as a share of the economy was 26 percent in 2000—the highest on record. Both these factors pushed hard against the impressive distributional results of the late 1990s, yet there's no denying that those results occurred.

Is there a lesson from this period that can be applied to the impending challenges identified by Cohen and DeLong? Admittedly, the strong demand of the 1990s was itself partially an aberration, caused by bubbles in key markets. But that objection conflates the source of demand (speculation) with its impact (for example, strong labor demand and income growth, tame inflation). The important point is that contrary to previous economic dogma, we can run a truly full-employment economy without overheating. In fact, full employment is a necessary condition for better distributional outcomes.

Can we reach full employment in a world with so much global competition? Won't it take that much more labor demand to absorb the trade-induced increase in labor supply? Cohen and DeLong argue that regardless of trade flows, the Federal Reserve sets the unemployment rate. In that regard the fact that Greenspan and Co. are currently raising interest rates with unemployment in the mid-fives doesn't bode well. On the other hand, Cohen and DeLong argue that greater openness will lead to faster growth, and that, too, could help to lower unemployment. But Samuelson's recent work on the unique challenges of offshoring skilled services raises some doubts about the conventional wisdom that greater openness always boosts growth.

The bottom line is that although we probably can't count on getting back to truly full employment in the near term, it's none too soon to start thinking about the necessary policy tools to get us there. In the context of Cohen and DeLong's compelling arguments, think of full employment as a critical offset to some of the economic insecurities caused by globalization.

A related point takes off from their brief discussion of steps we should take to ameliorate the problems they elaborate. They advocate "investments in retraining and rebuilding," "far more career-transition assistance," and "perhaps more government funding" for health care. Good ideas, all—but, first, let's lose the "perhaps" and specify a truly comprehensive safety net for workers displaced by these trends.

Second, as the authors astutely stress, a much bigger problem than actual job losses is the downward pressure on white-collar wages. This calls for a far more ambitious policy response: direct investment in the creation of high-quality jobs, a particularly straightforward way for winners to compensate losers. It might take the form of R&D in high-end industries such as medical research and information technology, which has the added benefit of potentially opening up new areas of comparative advantage.

The theme is to spend at least some of the gains from trade on both ameliorating the dislocations it engenders and directly investing in quality jobs for affected workers.

Jared Bernstein
Economic Policy Institute
Washington, D.C.

I'm standing on the diving board, trying to jump headfirst into William Powers's optimistic take on the niching of the media and the attendant polarization of their audience ("The Massless Media," January/February Atlantic). Unfortunately, I take issue with his assumption that partisan broadcast media have the same intellectual moorings as print. It's mostly an issue of mindful partisanship versus mindless partisanship. The partisanship of the media in the 1800s was a free-for-all of political opinion—or so I'd like to think—in which every passionate reader put in the mental effort to read and weigh the thoughts expressed. But the engagement levels for TV-watching … well, we do put thought into our American Idol votes.

Today 80 percent of Fox News viewers—and lesser majorities of other TV viewers—believe one of three untruths about the Iraq War, as opposed to a minority of (ahem) print consumers. So it's not a question of partisan outlets' making the debate more vigorous; it's actually the opposite—TV thrusting at us vigorous, opinionated voices who (theoretically) have done the deep thinking themselves.

Matthew T. Felling
Center for Media and Public Affairs
Washington, D.C.

Will Iran Be Next?

James Fallows's article "Will Iran Be Next?" (December Atlantic) usefully discusses the dangers involved in attacking Iran's nuclear program. Fallows also points out that an air strike against Iranian nuclear facilities would be less likely to succeed than the 1981 Israeli attack against the Iraqi Osirak reactor, because of the likely concealment and dispersion of Iranian nuclear facilities.

I agree with Fallows that Iran is likely to have concealed and dispersed its facilities, and that such countermeasures substantially complicate military plans. However, the success of the 1981 Israeli attack in delaying the Iraqi nuclear-weapons program has been greatly exaggerated. The French-supplied reactor at Osirak was not well designed for plutonium production, the pre-attack Iraqi route to building a nuclear weapon. Further, by 1981 the French had decided to supply the Iraqis with a special nuclear fuel that could be used to run the reactor but was not well suited for plutonium production.

More important, a rigorous inspection regime was in place to ensure that plutonium could not be produced and secretly diverted to a weapons program. The International Atomic Energy Agency was in the process of installing an extensive inspection regime that would probably have included twenty-four-hour camera surveillance and frequent on-site visits from IAEA inspectors (the reactor was not yet operative at the time of the attack). The French themselves had technicians on hand who filed frequent reports. France opposed Iraq's acquiring nuclear weapons, and would have suspended the supply of reactor fuel if evidence of plutonium production had been uncovered. The diversion of plutonium would have been difficult to conceal, given that it would have involved a number of non-routine activities, including possibly shutting down the reactor. Imad Khadduri, a former scientist in the Iraqi Atomic Energy Commission under Saddam Hussein, bluntly declares in his recent memoir that the idea that plutonium could be produced under this inspection regime without tipping off IAEA inspectors or French technicians is "delusional."

Rather than delaying the Iraqi nuclear-weapons program, the 1981 attack may actually have accelerated it. The attack appears to have heightened Saddam's interest in acquiring nuclear weapons. After the attack Saddam started an underground nuclear-weapons program, unbeknownst to the international community and hence free from the fetters of IAEA inspection.

Given that Osirak is supposed to be the prototypical success story of preventive attacks against a rogue state's nuclear program, this episode should give considerable pause to advocates of future preventive strikes.

Dan Reiter
Atlanta, Ga.

I was surprised that one option was not even mentioned, let alone discussed, in James Fallows's interesting article. That option is the nuclear deterrent strategy that was highly effective against the Soviet Union—once seen as an implacable enemy, an "evil empire."

The United States, with its overwhelming nuclear superiority, could make it known to Iran (and North Korea) that any use of nuclear weapons by Iran—whether against the United States or against any U.S. interest, including Israel, Turkey, Afghanistan, or Saudi Arabia—would be met with a nuclear response so overwhelming that it would effectively destroy Iran as a functioning society. Iran could not doubt America's capability in that regard, and though some question might remain about America's willingness to let the nuclear monster loose, the Iranians might find the idea credible with George Bush as president. In any event, the mere possibility of wholesale destruction as a consequence of using a nuclear weapon must surely give pause even to a madman. The historical record seems to show that it does.

This idea does not address the very real possibility that Iran could act indirectly, as suggested in the article, helping al-Qaeda or Hizbollah or some other terrorist group to stage a nuclear attack. Once again, however, the Bush policy of "either you're with us or you're against us" could make clear that the faintest hint of cooperation with a terrorist nuclear attack would also result in wholesale retaliation. The Iranians might then hesitate to provide such support, and might even attempt to police their terrorist friends for fear that Iran could be implicated in any terrorist nuclear attack.

Arthur Z. Moss
Wilmington, Del.

James Fallows neglects two points that considerably affect his thesis.

First, the Osirak reactor that was bombed by Israel in June of 1981 was explicitly designed by the French engineer Yves Girard to be unsuitable for making bombs. That was obvious to me on my 1982 visit. Many physicists and nuclear engineers have agreed. Much evidence suggests that the bombing did not delay the Iraqi nuclear-weapons program but started it. For example, the principal Iraqi scientist, Jafar Dhia Jafar, was asked by Saddam Hussein to work on the bomb only in July of 1981.

Second, Fallows fails to recognize that Iran is now in compliance with the Nuclear Non-Proliferation Treaty, after having failed to provide details of its uranium-enrichment program when it should have. The protocol Iran has declined to sign is an additional protocol that is not a part of the treaty itself. More important, the principal proponent of the treaty, the United States, has been in violation of the treaty almost continuously since its inception. The United States is continuing to develop new types of nuclear weapons and failing to disarm to the extent most scientists believe is desirable. The United States has refused to ratify the test-ban treaty. The United States is also violating Article IV by failing to help non-weapons states use nuclear energy for peaceful purposes, such as electricity production.

James Fallows and the U.S. State Department may not understand these matters, but any non-nuclear state that feels threatened by a neighbor or by the United States certainly does. Rhetoric about failing to follow the NPT is rightly perceived as insulting. If we wish to dissuade Iran from making nuclear weapons, then we must somehow find a peaceful way to persuade the Iranians that not making such weapons is in their interest and not merely in ours. We must recognize their sovereign rights and their legitimate pride. The threat of bombing is not enough, and is probably counterproductive.

Richard Wilson
Mallinckrodt Research Professor of Physics
Harvard University
Cambridge, Mass.

As a member of the House Armed Services Committee, I was troubled by one element of James Fallows's report. In the war game the mock CENTCOM commander, Sam Gardiner, stated, "We have no intention of getting bogged down in stability operations in Iran afterwards. Go in quickly, change the regime, find a replacement, and get out quickly after having destroyed—rendered inoperative—the nuclear facilities." In response Fallows rightly asks, "How could the military dare suggest such a plan, after the disastrous consequences of ignoring stability responsibilities in Iraq? Even now, Gardiner said after the war game, the military sees post-conflict operations as peripheral to its duties. If these jobs need to be done, someone else must take responsibility for them."

War games allow us to learn important lessons by anticipating the future. But it's a lot easier to learn lessons from the proven past.

During a recent visit to Iraq I asked one general to explain his job. He responded, "My job is to provide security. But security has many pieces: nation-building, democratizing, infrastructure development." When I asked him how he prepared for this use of "soft power," he said, "I've been trained as a warrior, not a nation builder."

The lack of post-conflict preparation undermined us in Iraq and, if repeated, will undermine us elsewhere. The Pentagon needs to move past its twentieth-century myopia on strategic, tactical, and policy consequences.

Even some who argue passionately about a "military transformation" can't break out of the twentieth-century mindset. They limit their transformational views to hardware: machines, technologies, weapons that are faster, lighter, more lethal. But Iraq proves that we also need a cognitive transformation: understanding the cultures we're fighting in, speaking foreign languages, developing better human-intelligence capabilities. To accomplish that transformation we must do three things.

First, reform professional military education to give our soldiers a deeper grounding in post-conflict operations, counterinsurgency, military history, foreign cultures, languages, psychology, civil affairs, policing, and more.

Second, remove the obstacles to military careers in civil affairs, psychological operations, and foreign area studies.

Third, invest more money in the underfunded and undervalued military agencies that understand how our enemies are exploiting the convergence of demographics, governance, disease, disconnectedness, tyranny, and religious fundamentalism.

That general in Iraq was right. Security has many pieces. And like it or not, the military will be called on to pick up the pieces after force is exerted. It shouldn't take a war game to teach that lesson. And the lesson we learned in Iraq was not new. We should have learned it from World War II.

Representative Steve Israel (D-N.Y.)
Washington, D.C.

James Fallows replies:
I appreciate Dan Reiter's elaboration on the circumstances of the Osirak raid in 1981. It underscores a point made by the participants in our exercise. Everyone agrees that the conditions for launching a successful pre-emptive strike were more favorable when Israel did it a quarter century ago in Iraq than they could be now for the United States in Iran. The targeted nuclear program was more primitive. The distance for attackers to travel was shorter. The element of surprise was greater. There were fewer facilities to strike. And if, even with these advantages, the Osirak raid was of questionable value in slowing Iraq's nuclear program, the prospects for an effective strike against Iran look all the worse.

The question posed to our panel was whether the United States had a realistic way of forcing Iran to give up its nuclear programs. Within the obvious limits of our exercise the answer was no. A pre-emptive air strike would probably be ineffective, and a full-scale land invasion would be harder than what America has undertaken in Iraq, especially for an American military that already has too much on its hands. Therefore our panelists concluded that the United States had only two realistic alternatives: using diplomatic and economic pressure to persuade Iran to change course, or finding ways to live with a nuclear-armed Iran (as we now live with nuclear-armed Pakistan and India). A strategy of deterrence, like the one Arthur Moss lays out, would be an important part of the latter option—if the former fails.

Richard Wilson may not have grasped the premise of the article. It was based on a hypothetical future showdown between Iran and the United States; it was not an exact report on current events. (A number of Arab press reports were based on a similar misunderstanding. They discussed our article as if it were a real, investigative account of real deliberations about real plans for attack.) Nonetheless, the conclusions our panel reached are very similar to those Wilson states in the final paragraph of his letter.

I agree with Representative Israel's observations about the modern responsibilities of the American military. For the purposes of our exercise we assumed that the (mock) commander of Central Command, Sam Gardiner, had been asked by the (mock) president to provide "military options" for dealing with Iran. This happens all the time in real life. As a way of deciding whether a certain course of action is even worth considering, a president may ask to see what the costs and consequences of the approach would be. In his role as commander of CENTCOM, Gardiner assumed that the president would not be satisfied with an answer that called for 400,000 troops and a five-year occupation and would want to see a more "manageable" option. This is the one he presented—and one the panelists opposed from the start, for reasons similar to those Israel outlines. For different reasons, it seems likely that real-world planners would also reject plans calling for a much larger invasion force.

In the original article Sam Gardiner explained why he placed a high value on war games. His experience at the National War College and elsewhere had shown him that one such exercise, with all its constraints and artificiality, provoked as much useful thought as a large number of lectures or ordinary classroom discussions. In reaction to our article we have seen responses concerning many of the themes raised by the panel: for instance, articles in Armenia and Azerbaijan about whether the United States would or should develop bases there to support an invasion, and discussion in many countries about the leverage Iran might exert in Iraq if it felt an invasion was near. It was in order to stimulate such consideration of America's choices in Iran that we published the article.

Among the Hostage-Takers

I found Mark Bowden's article "Among the Hostage-Takers" (December Atlantic) an enlightening look at modern-day Iran. One correction, however: the provisional government would have been interested in negotiating for spare parts for F-14 Tomcats, not F-16 Falcons. The F-16, which would later be widely exported to (and often manufactured by) U.S. allies throughout the world, is a more modern airplane of a different type. The F-14 was then the U.S. Navy's premier interceptor, optimized for multiple over-water intercepts in places like the Arabian Gulf. It was a strategic asset at the time, critical for a potential hot war during the Cold War, and was exported to no other country but our staunch ally pre-revolutionary Iran. Only the United States could provide the necessary parts and maintenance to keep this immensely expensive asset in the air. It was a significant U.S. bargaining chip. A healthy F-14 fleet might have made the difference in Iran's bloody but unsuccessful war with Saddam Hussein.

Jon A. Skinner
Eagle River, Alaska

The joke apparently is on Mark Bowden. The thumbs-up sign is not the same in Iranian culture as in our culture. It is equivalent to showing our middle finger—not quite the meaning the author believed.

Parvin Baharloo
Seattle, Wash.

Mark Bowden replies:
Parvin Baharloo may be correct about the traditional significance of a thumbs-up gesture in Iran, although I don't know for sure. But there is no doubt whatsoever that the Revolutionary Guard who made the gesture to me as he said "Okay, George W. Bush!" was not conveying an insult. I had been talking to him and his companion for hours, and their sympathies were clear.

Advice & Consent

In her letter to the editor in the December Atlantic, Carol de Lamadrid writes, "George W. Bush's success as governor of Texas, and his reputation as a 'uniter,' were due to the vast experience and political skill of his lieutenant governor, Bill Hobby, a Democrat and a legend in Texas."

I suspect that she meant to credit Bob Bullock, who served two terms as lieutenant governor from 1991 to 1999.

Bill Hobby served as lieutenant governor from 1973 to 1991; George W. Bush was elected governor in 1994.

Joe Bowbeer
Seattle, Wash.

I have one serious problem with Bruce McCall's "Nobel Prize Claim Form" (December Atlantic): no Nobel Prize is awarded in mathematics. The closest analogue to a Nobel Prize in mathematics is the Fields Medal, which is awarded every four years and includes a cash prize of $15,000 Canadian.

Michael J. Walsh
Somerset, Mass.

Readers are invited to discuss Atlantic Monthly articles and other topics in our subscribers-only forum, Post & Riposte (www.theatlantic.com/pr/).