Letters to the editor

Declaring Victory?

I have long admired and valued James Fallows’s work, and so I write the following with respect and some reluctance. Mr. Fallows interviewed me over the summer for his article “Declaring Victory” (September Atlantic). We had a long discussion, and I provided him with a copy of a piece I had written giving an assessment of the war on terror after five years, based on what I believe al-Qaeda’s current perspective is likely to be.

Mr. Fallows refers to that piece in his article, and the quotes he provides from it are rendered accurately. I would like to note for the record, however, that what I wrote in my assessment in no way supports Mr. Fallows’s conclusion that “We Win.” The entire thrust of the piece I wrote leads to a conclusion that is 180 degrees opposite of Mr. Fallows’s—in essence, “We’re losing.”

With respect for Mr. Fallows and all of the experts he interviewed, I would submit that a better title for Mr. Fallows’s article would have been “Whistling Past the Graveyard.”

Michael F. Scheuer
Falls Church, Va.

While most of James Fallows’s article makes sense, I believe he is mistaken when he states that “loose nukes” are “the one true existential threat to the United States.” A reasonably well-coordinated nonnuclear attack by terrorists on this country’s transportation system is feasible and would have devastating effects—subjecting millions to potential starvation, for instance. Or again, our remaining petroleum refineries, limited in number, constitute a target set that is vulnerable to rather crude, easily delivered nonnuclear weapons. Protecting refineries from this threat is probably impossible, and such an attack would severely damage our consumer economy. We could expect to be returned to approximately the economic level of 1910.

James Wooster
Lake Tapps, Wash.

James Fallows makes two related arguments. First, that Americans have less to fear today from a terrorist attack than we did in September 2001; second, that our government’s response to such attacks poses a greater risk to our nation than the attacks themselves. In light of recent events in London, his first proposition is getting more attention. But the truth of Mr. Fallows’s second point does not depend on the truth of the first; and it is the more important of the two. Why is it that we, the United States of America, persist in responding to terrorist attacks with behaviors that so obviously do not favor our survival? How have we, as a free citizenry, become so easy to manipulate into patriotic foolhardiness? And how has military action become America’s first resort, instead of our last?

One possibility is that military foolhardiness is an unexpected consequence of the all-volunteer military. As our domestic economy has hollowed out, there are now fewer opportunities for making an honest living. A young American can now aspire to careers in symbol manipulation (finance, marketing, computer programming, or online poker), but rarely in manufacturing or family farming. The U.S. military—our uniformed services and our weapons industries—is the employer of choice for a growing percentage of our wage earners. Thus, we are naturally less inclined to welcome criticism or doubts about its centrality to our culture. And with no one serving against his or her will, the natural gung-ho spirit of the career military has no offsetting cynicism from the unwilling grunts and their families.

John S. Detwiler
Pittsburgh, Pa.

James Fallows demonstrates the need for caution in characterizing major events and historical eras. By declaring, repeatedly, that the United States is engaged in a “war on terror,” politicians and pundits have entrenched the concept deep in the national psyche. Like the terms Kleenex and Post-it, the phrase has been “sold” as an unquestioned, official state of being.

To uproot this mind-set—that is, to convince the citizenry that the war is over and that we have won—we would need a flexible, reflective, informed audience. Five years of war-talk saturation and conditioning have probably not prepared the American public for substantive change. Ordinary folks have become mentally comfortable in the familiar Cold War rut of angst and fear.

This situation requires a persuasive, credible, informed, and intelligent person who, by example, summons trust and commitment. Few leaders match that profile today. But then I remembered An Inconvenient Truth, and Al Gore. Is that also Fallows’s thought?

Frances Monteverde
Austin, Texas

James Fallows is undoubtedly correct that the U.S. could move beyond the war-on-terror mind-set by declaring victory and putting the threat of terrorism in perspective. But this is not going to happen, at least not in the next two years. The Bush administration learned right after 9/11 how easily it could use fear to manipulate the public. It’s not going to forgo a strategy that it hopes will ensure political victory in 2006 and 2008. Instead of saying that “we could use a leader to help us understand victory and its consequences,” Fallows should have focused on how the leaders we have will continue to do just the opposite.

Ralph H. Brock
Lubbock, Texas

James Fallows replies:

I like and respect Michael Scheuer, and his standing to speak in this field is unquestioned. As mentioned in my article, he was for years the head of the CIA’s (now-abandoned) anti–bin Laden unit, and his books Through Our Enemies’ Eyes and Imperial Hubris have stood up well. Of course I cannot tell him what he really thinks, but I believe that the full analysis presented in my article is not so totally at odds with what Mr. Scheuer has written and said.

As a reminder, my chain of reasoning for “declaring victory” was this:

“Al-Qaeda Central” itself, the organization run by Osama bin Laden and responsible for the 9/11 attacks, has been seriously harassed and interfered with in the years since then. This reduces the risk of a devastating, large-scale assault that would amount to “another 9/11.” The airline-bombing plot foiled by the British this summer actually illustrates this change. Whether or not the cell that planned the attack was directly related to al-Qaeda, British police knew about their operations for months and could disrupt them when they chose.

Around the world, “copycat” or “self-starter” groups inspired or motivated by al-Qaeda will continue to pose a serious risk, as has already been demonstrated in England, Spain, India, and elsewhere. But—as events in those same countries show—the groups do not pose an “existential” threat to those societies. Nor will they to the United States, even if and when they succeed in carrying out an attack here. Indeed in the long run the greatest menace of their terrorism is in provoking societies to destructive overreactions (for instance, the United States being provoked by 9/11 to invade Iraq, which by all accounts has become the major training ground and rallying point for anti-U.S. extremists).

Therefore the best defense against the ongoing threat from such groups is to deny them sources of support; to carefully judge our response to provocations; and to emphasize the tools of intelligence, surveillance, and penetration that have historically proven most effective against them. The first area is where, as Mr. Scheuer says, the United States is most clearly “losing,” by alienating Muslims worldwide and jeopardizing its moral standing. Many of its mistakes come from excesses made more likely by the state of war (for instance, Guantánamo detentions, perceived as an us-or-them showdown between the United States and the Islamic world). But in all areas, the concept of an open-ended “war on terrorism” has outlasted its usefulness, for reasons elaborated in the article. The U.K. bombing plot, again, was broken by patient surveillance work, not by speeches about making war on Islamic fascism.

Mr. Scheuer might still challenge this formulation. But it shows how recognition of failure in one area, and recognition of ongoing threat, can be reconciled with declaring a successful end to the original war against Al-Qaeda Central itself.

I agree with James Wooster that nonnuclear attacks could make life difficult in America. In his excellent Global Guerrillas blog, John Robb (whom I mentioned in the article) has written extensively about the way such “system disruptions” allow small, weak groups to damage big, powerful targets. This underscores the importance of maintaining the right efforts to penetrate, destroy, and deter such groups.

I also agree with John Detwiler that reliance on an all-volunteer military has been a crucial part of the political calculus of this war. Those Americans in uniform, whether on active duty or in the Reserves or National Guard, have been called on for extraordinary sacrifice over the last five years. But they represent well under 3 percent of the total U.S. population. The rest of us have not even had to pay higher taxes for the wars in Afghanistan and Iraq or the larger war on terror. Practically speaking, there is no chance of restoring a military draft. But its absence obviously makes it politically easier to go to war.

As for Ralph Brock’s point, over the last five years I have done all I could to examine the way “the leaders we have” made the choices they made. And to Frances Monteverde I say: the imagined presidential oration that closed the article was meant to set a general standard rather than identify a particular candidate. But if I’d had any real-world figure in mind, it would have been Dwight Eisenhower.

Picturing Tibet

I strongly protest the caption accompanying your Photo Op (“Go West, Young Han,” September Atlantic). The new Tibet railroad is indeed an “engineering marvel,” but at the same time a moral disaster. I have seen this railroad during its construction phase, but I have also seen the plight of the Tibetan people, who suffer under the yoke of the Chinese oppressors and who are largely denied the opportunity to benefit from this engineering marvel.

China brutally invaded Tibet and subjugated the Tibetans. The Chinese ransacked monasteries and destroyed untold volumes of Buddhist literature, sacred scriptures, and religious objects—all under the guise of “liberating” the Tibetans, but with the real intent of destroying the ancient Tibetan culture and religion. Having failed at this, they now plan to overwhelm the native society with Han immigrants, effectively perpetuating the servile status of the Tibetans. The new railroad will facilitate this strategy, and it will also allow the Chinese to mine the region’s rich mineral deposits and transport them to the east, again without benefiting the Tibetans.

Paul W. Rosenberger
Manhattan Beach, Calif.

Camelot Revisited

Whatever his changes of view on other matters, concerning moderately left-of-center politicians in Britain and the United States Christopher Hitchens has been consistent. To him, they are all contemptible. For example, Neil Kinnock, Michael Foot, Bill Clinton, and John F. Kennedy have all gotten the back of his hand—in Kennedy’s case (“Feckless Youth,” September Atlantic), several times. Toward conservative politicians—Dwight Eisenhower, for one—Hitchens is frequently more indulgent, giving them credit for what he considers their morally correct actions and setting aside any actions that don’t conform with the picture of conservative wisdom he is trying to paint.

Readers of Hitchens’s article who were not alive and sentient in the United States in the early ’60s would not have the faintest inkling that John Kennedy was, in the context of his times, a liberal president. He supported breaking the House Rules Committee’s power to block legislation. He supported federal aid to education, Medicare, and a tax cut (then favored by Keynesian liberals). He was late to civil rights, but his Justice Department, staffed with young liberals, did integrate the University of Mississippi, and he also came to support and publicly champion civil-rights legislation far more extensive than anything Ike would have endorsed, and it was enacted after his death. He and his activist brother, Robert, came to be hated by many white southerners for these words and actions, and some southerners cheered his death. He pushed through the Test Ban Treaty. He started the Peace Corps and through his eloquence and undoubted charisma inspired a generation to enter public service (Hitchens might talk to people like Gary Hart, Chris Dodd, or John Kerry, or any one of the thousands of living Peace Corps alumni, about this quality). Perhaps they were all deluded. If so, the delusion had positive results. His views changed and matured. He was a better president in 1963 than in 1961. The jury is still out on what he would have done in Vietnam and about other matters. However, the liberal evolution of his brothers Robert and Edward after his death may give some indication of where he was headed.

Kennedy deserves the criticism he has received since his death about his sexual recklessness, his Cuba policy, his inflexible attitude toward the Cold War (though that too was arguably changing in 1963), and much else, which stern historians and freelance moralists now delight in cataloging. But any reasoned assessment of him should also take into account the actions and qualities that plunged most of the nation and the world into sincere mourning after his death in November 1963.

Peter Connolly
Washington, D.C.

It’s Getting Hot in Here …

I’m glad Gregg Easterbrook has finally joined the fight against global warming (“Some Convenient Truths,” September Atlantic). But his message would have more credibility if he admitted to his long-standing role in worsening the problem he now decries.

“The only reason runaway global warming seems unstoppable is that we have not yet tried to stop it,” he writes. But a big part of why the United States has waited so long to take action against global warming is that many of its citizens, government officials, and business leaders were convinced over the past fifteen years, by Easterbrook and others, that the scientific case for global warming was at best uncertain and at worst a fantasy promoted by environmentalist Chicken Littles.

It’s nice that Easterbrook has finally seen the light, and I hope he’s right that we can reduce global warming faster and more economically than commonly expected. But our task is much, much harder because we waited these additional fifteen years to get started. If Easterbrook wants to be heard now, he should own up to how he helped get us into this mess. Otherwise, he risks looking less like an optimist than an opportunist.

Mark Hertsgaard
San Francisco, Calif.

No one can seriously argue that the Earth is not warming. Yet Gregg Easterbrook joins the growing number of “catastrophards” who choose to forget what is known from history. According to a chart published by the Intergovernmental Panel on Climate Change, the Earth was warmer than it is now for a period of 300 years during medieval times. At the time of the Magna Carta, 1215, the growing season in England was three weeks longer than it is today, and there was viticulture as far north as Ely. At the same time, in southern and southwestern Greenland, crops were grown in sufficient quantity to support a population of about 2,000. Greenland was called “green” because that is what it was.

What caused this warm period? It certainly was not human activity. The catastrophards owe us an explanation as to why today’s picture is different from the Medieval Warm Period, and why it is other than a normal fluctuation of climate, the kind of fluctuation that has previously occurred.

Eugene J. Meyung
Charlottesville, Va.

Gregg Easterbrook ignores the basic science in his assurances that global-warming problems are easily solved. Carbon dioxide—the main greenhouse gas—is fundamentally different from the sulfur and nitrogen pollutants causing smog and acid rain. For smog and acid rain, promising directions in which to look for fixes were obvious, whereas for carbon dioxide, no such promising direction is known. Carbon-based fuels can only produce energy by creating carbon dioxide. No carbon dioxide, no energy!

John C. Miller
Professor Emeritus, Mathematics
CUNY City College
New York, N.Y.

Gregg Easterbrook replies:

Mr. Meyung is right that the Earth has warmed and cooled many times in the past, for reasons not understood. It’s also true that many diseases of the past were caused by natural forces poorly understood, but we don’t say, “Therefore we should not fear new diseases today.” Human activity is placing into the air large amounts of heat-trapping gases, and nearly all science academies have in recent years concluded that the current observed climate change is caused by those emissions. The science academies could be wrong, but it is prudent to work from the assumption that they are right.

I caution Professor Miller that I did not suppose global-warming problems are “easily” solved—rather, that solutions may prove less expensive than assumed. In the 1960s, it was said that auto engines simply could not function without expelling smog-forming compounds. Now we know they can, and the control technologies required are affordable. Today engineers don’t know how to build engines or power plants that burn fossil fuels without expelling carbon dioxide. If society creates a profit incentive to discover solutions, we may be pleasantly surprised.

As for Mr. Hertsgaard, I have never written anything suggesting that global warming might be “a fantasy promoted by environmentalist Chicken Littles.” My 1995 book on environmental policy, A Moment on the Earth, devoted a chapter to weighing the arguments of global-warming believers and naysayers, supposed it was impossible to know which side was right, and concluded, “Any reasonable policy that reduces the odds of climate change is more than worth the price.” Fifteen years ago, a thoughtful person looking at global-warming studies might have focused on the uncertainty; at that time the National Academy of Sciences itself emphasized uncertainty. Today a thoughtful person who looks at recent science, including recent National Academy of Sciences statements, must deduce there is a danger. Mr. Hertsgaard’s own work on this subject labors to divide the world into Sinister Conspirators twirling their mustaches and Noble Crusaders crying atop parapets; perhaps Mr. Hertsgaard has trouble grasping that someone can be skeptical, then be gradually persuaded by the evidence.

The Height of Inequality

It will be hard to forget the image, created by Jan Pen and recreated by Clive Crook in “The Height of Inequality” (September Atlantic), of the long parade of tiny low-income workers followed by a few mega-rich giants. Still, I have to point out that this is at its heart just a trick of statistics: Pen is juxtaposing one variable—height—which exhibits a tight variance (its graph would look a lot like the standard “bell curve” distribution of, say, SAT scores, and the numerical average would be about the same as the median), with another variable—income—which has large variance. You could get the same effect using many variables other than income. Imagine, for instance, that the parade walkers’ heights were based on number of lifetime sexual partners (Wilt Chamberlain would be quite a bit taller than even he actually was), or the number of hours each walker has spent piloting small aircraft.

The interesting question is whether a highly skewed distribution of income matters. I happen to think it does, but Crook doesn’t even get to the issue except with the throwaway “How much longer before the dwarves get restless?” line at the end of the article. Some would respond that income distribution has always been skewed, from the time most people were peasants and a few were dukes and kings; moreover, if you arranged this parade at any time in history until very recently, 90 percent or more of the marchers would be fending off their families’ starvation, whereas today you wouldn’t have to get too far into the parade, in the United States at least, to find that most marchers own television sets and air conditioners and are battling obesity instead of starvation. The question, then, is: If the lowest-income workers have been elevated to a certain level, does it matter (beyond some primitive, envy-based response) how big the people at the other end of the parade grow?

To ask the question this way is to answer it, at least for now. Despite historical progress, the first 50 million or so American marchers don’t yet even have health insurance; some percentage of the marchers are still underfed and ill-housed, as are their children. The image of the parade is an argument for redistributive taxation and other economic policies. Only after all of the marchers are provided with some reasonably humane minimum should a few of the marchers be permitted to grow grotesquely tall.

Andrew Stumpff-Kane
East Lansing, Mich.

The Hive

Marshall Poe’s illuminating article on the history of Wikipedia (“The Hive,” September Atlantic) includes a few problems: one of style, one of fact, and (possibly) one of judgment. First, although he explains the origin of many words unfamiliar to the nonspecialized reader, such as Wiki, mud, and gnu, he uses slashdotting without providing the origin of this term, which refers to sudden influxes of new Web-site visitors. (The technology-news Web site www.slashdot.org routinely posts articles about other Web sites and links to them; then when thousands of slashdot readers simultaneously try to link to the subject of the latest article, they sometimes cause the linked site to slow considerably or even crash.) Also, in Poe’s outline of the history of multi-user dungeons, two of the three examples he cites are not multiple-user programs. (Zork and Myst are single-player games, though a sequel to Myst did briefly exist in a multi-user form.)

Finally, Mr. Poe’s admission that he created an article about himself is likely to raise the eyebrows of Wikipedia users, as Wikipedia guidelines frown on this: the Wikipedia:Autobiography article tells would-be contributors, “Creating or editing an article about yourself is strongly discouraged.” Since Mr. Poe says that he has been interested in Wikipedia for about two years, I can only assume that he wrote his essay to inspire discussion, debate, and further research, as any well-written Wikipedia article should, and I applaud his sly experiment in provoking Wikipedian-type behavior in the readers of The Atlantic.

Kate Foster
Chicago, Ill.

Marshall Poe responds:

Kate Foster is correct: Mist, not the more-famous Myst, was one of the first MUDs, and the original Zork was not a MUD, as it was a single-player game. The first MUD, however, was essentially a multiplayer version of Zork called (appropriately enough) “MUD.” As for my “sly experiment,” Ms. Foster gives me too much credit. When I added the “Marshall Poe” entry, I didn’t know I was running afoul of the rules. I just wanted to see what would happen, and what happened persuaded me to look further into Wikipedia itself.

Upstairs, Downstairs

While we appreciate Sheelah Kolhatkar’s description of PinnacleCare services (“Inside the Billionaire Service Industry,” September Atlantic), we must clarify that PinnacleCare advocates are professionals, many with graduate degrees and all with many years of health-care experience. In fact, one of the two advocates Ms. Kolhatkar interviewed received her bachelor’s degree in foreign service from Georgetown University, her M.B.A. from the international business school INSEAD, and another master’s degree from the Health Advocacy Master’s Program at Sarah Lawrence College. The other advocate Ms. Kolhatkar met has been in the health-care field for twenty years and, among her many achievements, has developed innovative, award-winning disease-management programs.

The fact that most of our advocates are women does not mean that they are “motherly” (though compassion is a strong suit), that they “coo” (though they are persuasive and diplomatic with surgeons and hospital administrators alike), or that they extend their “claws” (though they are fiercely dedicated to getting the right health care for our members). The article’s accolades were wonderful, but—as our members would agree—our advocates deserve substantially more respect.

Miles J. Varn, M.D.
Chief Medical Officer, PinnacleCare
Baltimore, Md.
