James Fallows's "Bush's Lost Year" (October Atlantic) contains several gaps and contradictions. I do not understand how Fallows can argue that our invasion of Iraq was a mistake without seriously addressing the belief held by most intelligence agencies that Iraq had weapons of mass destruction. Even if stockpiles have not been found, I've heard no one argue that Saddam Hussein did not have the capability to work on these weapons, especially as UN sanctions enforcement had begun to unravel. No cost-benefit analysis of the Iraq campaign is complete without an assessment of WMD potential. Fallows himself raises this question when he notes that many found the war costly but, based on WMD intelligence, necessary. He leaves the question unanswered, but I would argue that if the war was necessary, then discussing its opportunity costs is an intriguing but pointless game.
Fallows ends his article with a discussion that I believe would have been better placed at the beginning: he writes that the terrorists attack us not because of our values but because of our policies. This charge is so incendiary that it colors everything else. First, I do not see how such an argument can be made without acknowledging the work of Thomas Friedman in The Lexus and the Olive Tree. Western media, entertainment, and business, and most especially women, are a constant threat to Islamic extremists and a constant temptation to the hearts and minds of youths in the Middle East. The mullahs, Friedman argues, see the West as their enemy. Fallows supports his claim that the terrorists are attacking us for our policies by quoting the observation that Norway has not yet been attacked, but earlier he contradicts this argument by noting that terrorist attacks "especially against Americans and Europeans" have increased. Does Fallows seriously contend that terrorism is not a global threat to civilization but merely a reaction to U.S. policy? Following the massacre in Beslan, we are reminded that the terrorists' only logic is the logic of a love affair with death.
Fallows argues that the first casualty of Iraq was Afghanistan, but his view of Afghanistan is confused and distorted. First, he fails to mention the elections then scheduled for October—an achievement of consequence. Second, he assumes that the continued influence of the warlords is a defeat of U.S. objectives, although Afghanistan has always been a collection of tribes. And finally, Fallows suggests that U.S. troops should have forcibly entered Pakistan to cut off avenues of retreat for bin Laden, even at the risk of destabilizing Pakistan's government. Here Fallows grossly underestimates the dangers of unleashing Islamic extremism in Pakistan. There is, of course, the nuclear question; but beyond that Saddam would have been left with a free hand while we got dragged into a quagmire in Afghanistan and Pakistan. Fallows proposes a nightmare scenario: the entire Middle East in chaos, with nuclear weapons loose in Pakistan, and Saddam wielding biological and chemical weapons in Iraq—all on the off chance of nabbing bin Laden in some remote cave. Our troop numbers in Afghanistan are and have been low not to save soldiers for Iraq but because of a deliberate policy choice: the United States has always approached Afghanistan lightly, and that choice has had slow but effective results.
Corey R. McCool
I have read James Fallows's articles and editorial comments for a number of years, and I respect him as one of the more astute observers currently writing on foreign policy and politics. But I take serious issue with the conclusions he has drawn in his October cover story. Or perhaps I should say that I take issue with his major premise, as stated in the sub-headline "How the War on Iraq Undermined the War on Terror." What Fallows apparently doesn't understand—and the media won't tell the American public—is that the war on terror and the war in Iraq are part and parcel of the same effort. Iraq is currently the focal point and principal battleground in the war on terror.
Marine Major William Truax said it best—in a recent letter from his position on the staff of the multinational headquarters in Baghdad—when he observed, "The bad guys did us a huge favor by gathering together in one place and trying to make a stand. It allowed us to focus on them … there rather than hunt them down in their home countries." If Fallows needs any reinforcement of this point, consider that the interim Iraqi Prime Minister, Ayad Allawi, told both the UN and Congress that more than half the terrorists in Iraq—not militants, insurgents, or freedom fighters but terrorists—are from foreign countries, principally Jordan, Syria, Saudi Arabia, Iran, Chechnya, and Afghanistan, and from groups such as the PLO, Hamas, and Hizbollah.
The Jordanian terrorist Abu Musab al-Zarqawi, a high-ranking al-Qaeda operative, is widely acknowledged as the leader of those trying to prevent the democratization of Iraq. The remaining terrorists under his control are the hardcore Baathists and the remnants of Saddam's Republican Guard, which evaporated at the end of major combat with barely a shot fired, only to re-emerge as killers of their own people—as they were in their previous existence. Until the world comes to realize that Iraq is an integral part of the war on terror and is currently its most decisive battleground, we face the very real possibility of abandoning the field to the terrorists, only to be forced to engage them in our own country at some future date.
Richard J. Toner
Brigadier General, USAF (Ret.)
Colorado Springs, Colo.
Our presence in Afghanistan is not nation-building but simply an effort to keep terrorists from using that country for another 9/11. This takes only a relatively small number of intelligence experts, linguists, and boots on the ground. Our much larger commitment to Iraq is a strategic move to check the two countries that pose the greatest threat to the stability of the Middle East and the Islamic world: Iran and Pakistan (not to mention Syria). Troops and bases in Iraq give America more options in dealing with Iran's nuclear program and support of terrorism. Likewise Pakistan, in the event of the fall of the Musharraf government.
I find it disconcerting that The Atlantic would print "Bush's Lost Year" and "The Long Hunt for Osama" in the same issue two months before the general election. Both are biased and mindlessly critical of the Republican Administration. May we hear from Donald Rumsfeld or Paul Wolfowitz next? Maybe The Atlantic should consider a move out of Boston to a more politically neutral part of the country.
Eugene R. Donaldson
In "Bush's Lost Year," James Fallows goes on at great length but misses the point. The real issue is incompatibility of goals, not whether terrorists are merely irrational or "logical" but brutal. Fallows would have had the President focus his energies not on Iraq but on imposing a land-for-peace deal to solve the Arab-Israeli conflict. But the reason such a deal has eluded so many U.S. Presidents is that the goals of Israelis and their Arab antagonists are incompatible: the Israelis want peace (and Prime Minister Barak offered virtually all the land taken in the 1967 war in hopes of obtaining it), whereas the Arabs are determined to destroy the Jewish state.
In using terror, whether against the World Trade Center, Spanish railroads, Bali nightclubs, or Russian schoolchildren, the perpetrators may well be "logical," given the tremendous disparity in power between them and the countries and institutions they target. So what? The problem is that the underlying goal of Islamic extremists is to re-create a global Islamic empire, a goal involving destruction of the West and the worldwide triumph of Islam over Judaism and Christianity. The extremists must be crushed not because they are "irrational" but because their goals are incompatible with the survival of Western values.
Rael Jean Isaac
James Fallows calls "claptrap" the notion that Islamic terrorists hate us for who we are rather than for what we do. However, in Alan Cullison's "Inside al-Qaeda's Hard Drive" (September Atlantic) no less an authority than Osama bin Laden contradicts him. On April 11, 2001, Osama wrote to Mullah Omar that the United Nations "has become a new religion that is worshipped to the exclusion of God" and "issues documents and statements that openly contradict Islamic belief, such as the International Declaration for Human Rights, considering all religions are equal." Nor did bin Laden limit himself to declarations. He put a price in gold on the heads of both President Bush and Kofi Annan. Moreover, the UN offices in Iraq were bombed by Islamic terrorists. Bin Laden never condemned this crime. The UN's principles are more important to bin Laden than its policies (the UN is certainly less supportive than the United States of Israel). If this is true for the United Nations, why not for the United States?
Mario Martini
Oak Ridge, Tenn.
"Bush's Lost Year" is powerful and thought-provoking. However, James Fallows should have dug one layer deeper in his comments on "misreading the enemy." Although I do not support President Bush's policies, I disagree when Fallows disparages his remark that "they hate us for who we are." By assuming that Islamist terrorism represents only a reaction to American foreign policy in the Muslim world, flawed as it often is, Fallows overlooks a more challenging struggle.
True, al-Qaeda and other Islamist groups frequently cite our support for corrupt, autocratic regimes in nations such as Saudi Arabia and Egypt. They also refer to the Israeli-Palestinian conflict, although Osama bin Laden threw this issue into the mix only after 9/11. And the situation in Iraq seems to have added fuel to the fire. But the root cause of Islamist terrorism lies much deeper.
A battle for control of the Muslim world is taking place. Islamists—as opposed to Muslims in general—seek to force all Muslims into a single theocratic state. Bin Laden himself has publicly expressed his desire to whisk the Muslim world back fourteen centuries to the time of Muhammad, when Islam was—by his and others' definition—purer. Forcing the United States out of Saudi Arabia and the Middle East is not the Islamists' objective. Rather, the withdrawal of American influence is seen as setting the stage for the overthrow of Muslim governments from Algiers to Jakarta. Along the way Israel would be destroyed, since Islamists do not accept the principle of a two-state solution. Then, with control over significant oil reserves, a resurgent caliphate might believe itself ready to confront the West, perhaps first targeting the former Muslim lands of Spain and Portugal. Fantasy? Not according to bin Laden's videotaped and widely broadcast statements.
The Islamists demand justice while continually calling for the deaths of "Crusaders and Jews." This ideological approach, based on the claim of sole religious truth, scorns the compromise and practicality of politics. The Islamist goal lies not in empowering Muslims in free societies but in shackling them within a global Muslim polity defined by the repression of the Taliban and the violence of Nazi Germany.
In sum, "They hate us for who we are" may be the most insightful comment President Bush has ever made, even if he could not or would not fully articulate what underlies it.
David Perlstein
San Francisco, Calif.
I have two points about "Bush's Lost Year." First, criticism of the decision to go to war misses a point about decision-making under uncertainty. Second, it isn't Bush's "lost year"—it's Bush's lost three years.
1) In decision-making under uncertainty, one factor—often the overriding factor—is the consequences of possible wrong decisions. In this case there were two ways a decision could have proved wrong:
2) Remember how pumped-up and angry we were after 9/11? We were ready to do something. We were ready to go to war. Bush told us to win the war by going about our daily lives. Bush did not take us to war. He took us to half-war.
I was born a few weeks after Pearl Harbor. This means I remember people discussing World War II as recent history. In World War II the nation was at war. Every American was affected. I am not suggesting that we need to reinstate the draft and widespread rationing. I am suggesting that we should be prepared to accept some difference in our daily lives—if only by paying war taxes and enduring some shortages. Korea and Vietnam were half-wars: unless you knew someone who was fighting, you weren't really involved. We know how those half-wars turned out.
Richard Brandshaft
James Fallows replies:
Views about Iraq, Afghanistan, and the Bush Administration's general approach to terrorism are widely divergent and strongly held. My goal in "Bush's Lost Year" was to report the judgments of a number of people with unusually strong standing to speak. Most have been directly involved in planning, executing, and assessing anti-terrorist operations. Few have any active connection to partisan politics, and in their personal politics most are Republicans. Over the past three years most have been proved right in their judgments about the war. They noticed the rush to overinterpret intelligence about Iraq's weapons programs; they foresaw that occupying Iraq would be far harder than conquering it; and they warned ahead of time about policies now generally recognized as mistaken—for instance, tolerating rather than controlling the widespread looting in Baghdad immediately after Americans took charge. I quoted most of these people by name, in this article and in two previous ones. A few, having seen the consequences of directly criticizing the Administration, insisted on anonymity.
Re-arguing their full case would take as much space as the original article—and probably would not change minds unconvinced the first time. But let me address a few themes that run through these letters.
One concerns the long-term motivation of terrorists and the best ways to thwart or control them—a subject I will examine at length in an upcoming article. Of course active, committed members of terrorist cells need to be arrested, stopped, or killed. Of course a prevailing anti-American sentiment in the Islamic world grows in part from envy and resentment of U.S. wealth, power, and success. But to reduce the challenge, as President Bush so often has, to being hated "for who we are" creates a serious strategic liability for the United States. The main harmful consequence is to deter us from something that is indispensable in any struggle: knowing the enemy, which in turn means understanding his strengths, his ambitions, his style of thinking, and his vulnerable points. I agree with David Perlstein that understanding the struggle for control of the Islamic world is an important part of the process. No trace of this has shown up in the President's public utterances. I disagree with Mario Martini's interpretation of the attack on the United Nations. Yes, Osama bin Laden had previously denounced the United Nations. He (or someone sympathetic) launched the devastating bomb assault against UN facilities last year only after the UN had opened an office in Baghdad—and could, therefore, be seen as helping the United States restore order to Iraq. The United Nations was not attacked for what it was; it was attacked because this would strike a blow against American ambitions and plans.
I'd reply in much the same way to Corey McCool. The assertion that Iraq might have had the "capability" to work on nuclear, chemical, or biological weapons is similar to frequent claims by the Administration but is meaningless. Dozens of countries also have this "capability." The Administration made the judgment that Iraq was so overwhelmingly threatening as to force North Korea, Iran, and every other threat to second place. That judgment was questionable at the time and is plainly false in retrospect. McCool wrote without the benefit of seeing the final report from the Bush Administration's head weapons inspector, Charles Duelfer, who concluded in early October that Iraq had no active weapons programs before the war.
I agree with Richard Brandshaft that in the months before the war the United States had to balance two areas of uncertainty: the unknown risk that Saddam Hussein actually had nuclear weapons, and the unknown risks and problems that would be created by starting a war. Unlike him, I thought the second kind of risk was more serious. As long as UN inspectors were inside Iraq, as they were in this crucial period, the risk that Saddam Hussein could do something by surprise was very small. On the other hand, the risks of going to war before the United States had lined up the broadest possible coalition, and worked out the fullest possible postwar plans, seemed very large. Therefore it seemed to me that time was on our side as we waited to attack in the right way. I agree with Brandshaft about the problem with "half-wars."
Like Rael Jean Isaac, I believe that Israel is an important ally of the United States. It is the one democracy in the region; the United States is committed to its secure existence; its citizens need to live free from terrorist attack. I disagree with the implication that it has played no role in the destructive standoff in the region, especially given its settlements policy.
I appreciate Richard Toner's letter but disagree with him on what is often called the "flypaper" theory of America's Iraq strategy: that we can attract the world's terrorists to Iraq and kill them there. This assumes that there are only so many terrorists in the world; the more you kill, the fewer there are. The evidence suggests just the reverse—which is why the report from the International Institute for Strategic Studies this summer was so significant. It found that the worldwide number of active al-Qaeda members had significantly increased since the United States began picking them off in Iraq. In any case, it is hard to imagine this having been presented ahead of time as a reason to go to war: "We want to send troops to Iraq as bait, so they can be shot at and bombed, and we can then kill the attackers." Also, the terrorist Abu Musab al-Zarqawi's relationship to al-Qaeda is generally thought to be looser and more complicated than General Toner asserts.
As for Eugene Donaldson's question, I repeat that the heavy majority of those I quoted were Republicans. I have previously interviewed and provided extensive quotations from Paul Wolfowitz and Douglas Feith. No senior official from the Pentagon or the White House staff accepted my request for an interview for this article. The timing of the story was dictated by unfolding events in Iraq.
I read with interest the letter of Joseph M. Price, M.D., published in the October Atlantic, which makes eloquently and succinctly the same diagnostic points that were in my mind as I read James Fallows's piece "When George Meets John" (July/August Atlantic). President Bush appears to be developing dementia.
As a politically and personally naive young physician I watched Ronald Reagan dementing in office and during his re-election campaign, and I was amazed that no one seemed to take public notice of his obviously developing dementia.
Price is exactly correct in his approach to a diagnosis using the evidence in Fallows's article. Supporting his judgment is correlative evidence from those who have observed Bush's behavior in the White House—for example, his lack of attention to detail in executive meetings and his inability to ask probing questions—and from campaign-trail reports of his confused and sometimes bizarre ad-lib responses in public appearances.
Daniel L. Johnson, M.D.
Joseph Price overlooks the obvious reason for the change in President Bush's debating skills: Bush can no longer speak freely because he is constantly aware that every word he says is scrutinized by opponents at home and abroad for anything that could possibly be turned or twisted to his discredit. The danger of careless speech was brought home with brutal clarity to all politicians when Senator Trent Lott was pilloried for a well-meant, thoughtless compliment paid at a birthday party to an aged friend.
I'm voting Democratic because I'm pro-choice, but let's be fair! For anyone in the President's position, reluctance to debate or to speak off-the-cuff, and hesitation before speaking at all, are indicative not of failing faculties or presenile dementia but of good old-fashioned common sense!
Sonia Bennett Murray
Joseph Price's "diagnosis" of President Bush's more recent speaking difficulties as "presenile dementia" is at once laughable and grossly irresponsible. Physicians often "diagnose" famous people in a half-joking manner as lunchtime chatter. The guesses seldom go further, because the likelihood of error is huge without firsthand medical evidence. James Fallows's article was not a medical chart, and Fallows had the restraint to know it. Price's "putting two and two together" this liberally would make psychiatric patients of us all. In this setting flinging a diagnosis, as if the White House physicians should act now to save the President's dwindling mind, is nothing short of ludicrous.
Shane M. Bezzant, M.D.
San Antonio, Texas
For a doctor to express vitriolic opinion from the pulpit of his profession is unfair and a travesty of the public's trust. Let your readers know that medical doctors by dedication separate their personal feelings from the perceived characteristics of their patients, and that most exercise the same restraint when professionally engaging in public matters.
Bill Anderson, M.D.
Laguna Beach, Calif.
I have come to admire James Fallows's ability to see the big picture with respect to any issue, but I thought "The Big Picture," his article with V. V. Ganeshananthan in the October Atlantic, offered anything but. I do understand his preoccupation with private, high-priced "elite" institutions. The Atlantic wishes to appeal to its readers, and admission to such colleges may well be among their concerns. However, as a parent who recognizes that the overwhelming majority of students in the United States attend institutions other than the typical "Top Twenty" that are the focus of the article, I was dismayed to find only one mention of a public university, UC Berkeley, which happens to be the only public university ranked anywhere close to the very top.
Fallows even cited interviews with representatives from private schools (Harvey Mudd, for example) to talk about the problems of public education. Has public higher education sunk so far below the radar that The Atlantic has to resort to that? Fallows also neglected to note that the National Survey of Student Engagement, which he cited several times, was created and is centered at Indiana University, a large public university.
That children of upper-class families view anything other than the Top Twenty as a failure may well show the extent of our broken society in an age when the gap between rich and poor is ever increasing, and educational pedigree can mean more than intellectual accomplishment. However, I doubt that ignoring the value of public education is the best way to solve the problem.
University of Illinois at
James Fallows and V. V. Ganeshananthan offer a lucid, wide-ranging, and insightful portrait of the contemporary undergraduate-admissions process in the United States. The authors have a keen eye for the increasing unpredictability and escalating competitiveness of the process, the gamesmanship behind early-decision applications and merit-aid offers, the occasionally scandalous preferences afforded athletes, and the ensuing frustration that drives almost everyone involved in the process to distraction.
The most important message from this year's survey of college admissions, however, has to do with a subject that is just beginning to enter the public's consciousness, though it has been a topic of growing contention among higher-education leaders and students of public policy. Fallows and Ganeshananthan offer a sobering report on the cascading financial effects of income inequality, rising tuition costs, and declining state and federal aid on the college-participation rates of students from disadvantaged socio-economic backgrounds.
In an otherwise fine piece of reporting the authors mischaracterize one of the primary conclusions that William G. Bowen presented this past April in the Thomas Jefferson Foundation Distinguished Lecture Series at the University of Virginia. Rather than saying that the nation's selective colleges and universities were turning into "bastions of privilege" as opposed to remaining "engines of opportunity," Bowen emphasized the progress that higher-education institutions have made in enhancing opportunities for "poor but worthy" students.
The critical question, as Bowen, Martin Kurzweil, and I note in our forthcoming book, Equity and Excellence in American Higher Education, is whether the nineteen selective institutions we studied have done enough. Our research reveals that students from bottom-quartile families have only one sixth as good a chance of getting into the credible pool of applicants as students from top-quartile families, though those who do make it into this pool have the same chance for admission, at any given SAT level, as any other applicant. In our opinion, equity cannot be achieved through a need-blind approach when students are so academically stratified by socio-economic status in their pre-college years. Although need-blind admission policies represent a step up from practices that explicitly handicap students who require financial aid, such policies still sustain the socio-economic divide by relying on formal academic measures that are themselves strongly correlated with socio-economic status.
We believe that selective institutions should attach a positive weight to both disadvantaged socio-economic status and minority status in their admission decisions, and thereby actively contribute to social mobility in this country. We recognize that much more needs to be done at all levels of government and at all levels of our educational system, but the nation's leading colleges and universities can send a powerful message by taking action now.
Eugene M. Tobin
Andrew W. Mellon Foundation
New York, N.Y.
By interviewing only the insiders of the admissions process, James Fallows and V. V. Ganeshananthan allowed ranking-induced irrationality to find its way into one of their conclusions. Citing complaints from "other colleges" about "aggressive prospecting" for applicants and an "attract to reject" strategy, they concluded that Washington University in St. Louis achieved Top Ten ranking "largely by attracting more and more applicants." If this were true, we would expect Washington University's selectivity to be disproportionate to other indicators of quality, including graduation and retention rates, faculty resources, class sizes, student-faculty ratio, SAT and ACT scores, financial resources, and alumni giving. This is not the case. When Wash U tied Dartmouth for ninth place in 2003, its selectivity rank was ninth (four-way tie) and its acceptance rate was twelfth to fifteenth lowest (four-way tie). If Wash U application rates have risen, this has only brought them into line with other indicators of true quality and value. If Ivies and other top schools recruit less actively, could it be that their admissions departments are able to rest on the laurels of their brand-name status and watch a steady parade of top students from around the country come knocking on their doors?
If the authors had interviewed students and parents, this is the kind of information they could have added to their analysis. Our daughter visited fourteen colleges and applied to six: three "Gotta-Get-Ins" and three "well-regarded non-elite" colleges. She felt no "aggressive prospecting" from any of them. No glossy brochures arrived from the Ivies (not even for a National Merit Scholar and Siemens-Westinghouse semifinalist). They counted on us to know who they were and how to find them, which we did (two visits, no applications). Our own Northeast biases were happily shattered by many excellent colleges in other regions. Details about Wash U (and some other colleges) were first learned by word of mouth from high school upperclassmen, who chose Wash U and talked about it back home. Now attending Wash U herself, our daughter describes its institutional attitude as "We're glad you're here, and please let us know what we can do to make your experience better."
Your superb coverage of the admissions process omits a significant factor: the loss of endowment revenue in the stock-market bust. Many colleges cannot afford to accept disadvantaged students during these difficult times. Furthermore, the rising medical and retirement liabilities of faculty and staff make tuition decreases almost impossible.
Meanwhile, reliance on the SAT or the ACT poses two problems. These tests give students a poor gauge of worth. Good scores create a pride that may or may not be well placed. Until students face their peers in class, until they face others of like intelligence, they may falsely assume that their potential will carry the day. Written by educators, and timed, the tests fail to evaluate imagination, wit, grace, and determination. These qualities may emerge during the college years or may exist already, but I doubt that admissions officers have time to see them given the volume of applications they must process. Yet the numbers give the admissions officer a simple method of eliminating many applicants.
San Luis Obispo, Calif.
The fact that decisions about admission to top colleges "often seem inexplicable or based on whether a student has forged a personal connection with an admissions officer" illustrates what is wrong with the "judge the whole person" ideology that is so much in vogue nowadays. Though it appears humane, this ideology robs applicants of their privacy by opening all their activities to scrutiny, and it puts applicants at the mercy of admissions officers' likes and dislikes. Colleges would do far better to admit students on purely academic grounds, choosing at random among equally qualified applicants and providing for affirmative action by lowering the academic cutoff point for under-represented racial minorities and the poor.
I agree with Gregg Easterbrook's article "Who Needs Harvard?" (October Atlantic). In 2000, having been rejected in my bid for a spot at one of the Ivies, I went to Rice. It's among the best decisions I've ever made. I graduated this May not only enjoying history more than when I matriculated but also with a scholarship to pursue my dream of studying international relations overseas, at the Australian National University. A college education is less about where you go and more about what you make of it.
Friends of the University of Rochester are disappointed! This great small research university is mentioned in passing by Gregg Easterbrook as among the "estimable" colleges (though it actually belongs in the "slightly less good than the elites" category). But illustrations of its campus are used throughout the College Admissions section with nary a mention. Surely if its campus can be made emblematic of the American college, its other virtues deserve praise as well.
Gregg Easterbrook rightly dismisses the premise that an education of substance can be found only at one of the fine institutions noted in his "Gotta-Get-In" list. But Easterbrook is mistaken in his reference to a Rhodes scholar this year from Hobart College. Two Hobart alumni have been Rhodes scholars: Ralph C. Willard, class of 1904, and Emerson Spies, class of 1936. However, Julia James, of the William Smith College class of 2004, is the first William Smith woman to earn this prestigious honor.
Hobart and William Smith are coordinate liberal arts colleges that share a campus, a faculty, and a president but have separate deans, diplomas, student governments, and athletic programs.
Mark D. Gearan
Hobart and William Smith Colleges
It's about time colleges finally got serious about measuring how much their students have learned ("Measure by Measure," October Atlantic). For far too long they've taken undue credit for what their students bring to class in the form of socio-economic status and inherited ability, rather than for what they are taught by their professors. Unfortunately, answers won't come from the three-hour Collegiate Learning Assessment. That's because assessment is more complex than merely asserting that a test measures how well students have learned to think, not just the particular facts they've memorized. The CLA's makers point out that college seniors had significantly higher CLA scores than freshmen with comparable SAT scores; but colleges need to demonstrate more directly that the skills evaluated on the CLA in entering freshmen and then again in graduating seniors were developed by classroom instruction or campus-related activities.
Los Angeles, Calif.
It's a rite of passage for upper-middle-class parents to become apoplectic about the college-admissions process, but what makes them go comatose when it comes to thinking about the kind of citizen that comes out the other side? Admission is not the end, and I appreciated Richard Freeland's points on the blend of practical and traditional education ("The Third Way," October Atlantic).
With three kids who are either in or just out of college (and twelve SAT exams, five prep courses, twenty-one college applications, nine AP exams, and more than thirty college visits behind me), I feel particularly qualified to comment on how much value I've gotten for my dollar. At the risk of annoying my children, I've found that college education (even at the most selective colleges) may be one of the worst value propositions around.
A few suggestions from a $300,000-plus-tuition-paying mom:
1) Create colleges that come in different flavors. Why does every school boast a ten-to-one student-teacher ratio and show happy kids under autumn leaves (even when the school is located in the high desert)? With a slight nod to selectivity and the pedigree of your kid's peer group, once you're in, course offerings and teaching styles are pretty much the same. On average my kids had only one professor a year they truly enjoyed; the others were satisfactory at best. We need more variation in colleges (look at Warren Wilson and Reed College as two good examples), and we need teachers who can engage and inspire their students.
2) Lose the country-club motif. I don't mind my kids' having to learn to clean a toilet, plant a shrub, or paint their dorm (especially if it'll keep tuition costs down). Telling them that they can study while others pick up their garbage is something I'd never do at home. It's misguided, and it makes for rotten citizens.
3) Use the Myers-Briggs test or some equivalent. As parents we've raised our children to believe, to a fault, that they can grow up to be anything. At some point they have to realistically narrow down the list. Personality tests are not the holy grail, but they do offer a data point for pseudo-polymaths.
4) Make advisers earn their keep. College advisers seem a lot like school-dance chaperones: nice idea, but useless. They get involved too late in the process to be much help. They seem to help pick a major by counting up the students' credits. They don't make helpful recommendations like "Take statistics or learn HTML if you're thinking about a poli-sci major," and they seldom plan for things like junior year abroad. Worst of all, they have only minimal contact with options in the outside world. Professors' pay should be related to their success as advisers.
5) Provide more real-world contact. Internships, community outreach, neighborhood partnerships—if you're a studious reader of bulletin boards on campus, you may score an opportunity for such experience; but it should be made a more formal part of the system.
I suggest that The Atlantic differentiate its coverage of the college process by looking at the other end of the pipe next year. It's equally frightening.
New York, N.Y.
Richard M. Freeland replies:
Robin Raskin expresses the frustration of many parents about the perceived lack of attention by many colleges to the practical challenges of life. Her cri de coeur helps us understand why practice-oriented education, which combines liberal education, professional education, and workplace experience, is proving attractive to students and compelling for educators. There is abundant evidence that college holds the key to economic security, but students shouldn't have to choose between liberal education and professional studies.
It seems extraordinary to me that none of the five essays on college admissions in the October Atlantic mentions (much less discusses) the senior military colleges and universities (for example, Norwich University and Virginia Military Institute) or the national service academies. Given that the Defense Department controls such a large percentage of the federal budget, and requires patriotic, intelligent, broadly educated, thoughtful, and well-disciplined people, it is worth noting that these institutions supply the United States with a disproportionate number of those making decisions about defense, national security, foreign policy, and the expenditure of taxpayer dollars.
Your readers deserve to know how potentially influential people are selected by the military colleges and national service academies, and about the nature, breadth, and quality of the education students at those institutions receive.
Robert W. Christie
Until I read Michelle Cottle's "Brief Lives" bio of Teresa Heinz Kerry ("The X Factor," October Atlantic), I hadn't really "considered" her. I knew, of course, that she was John Kerry's wife, and that she had inherited a soup fortune from her first husband, but I hadn't really thought about her.
Cottle's article changed all that. I now think of Teresa (a name that Cottle gleefully points out is "pronounced Tuh-ray-za") as a woman who, if she becomes First Lady, has a more than excellent chance of being as much maligned as Eleanor Roosevelt—perhaps even more.
Teresa is "eye-poppingly rich," "mouthy," "outspoken," spunky, saucy; has an "international" outlook; and, God forbid, might even turn out to be a "co-President"! And, if you can imagine it, she had the nerve to address the Democratic convention in five languages. Five!
Yet for all her flamboyance, she "purrs" (in her "accented and strangely phrased English") when she explains the conception the political world has of what's acceptable from wives of politicians. And that voice! It's "low, throaty … in a sexy, Sophia Loren kind of way … even when she's discussing what it's like to be labeled 'outspoken,' 'opinionated,' 'tough,' and even 'crazy' by the chattering classes." Why, she can even "swish her hips and just take off to the music"!
Even worse, she restructured her first husband's foundation to use the money for liberal causes—"improving early-childhood education, expanding drug coverage for seniors, and helping cities pursue environmentally friendly development."
This is indeed a woman to watch! I hope the next time I go into a Wendy's (I love their Frosties), I just might be privileged enough to see Teresa, that "too glamorous … exotic bird" in her "movie-star sunglasses," reach down to "ruffle the dark hair of a small boy perched in a booster chair."
To Michelle Cottle, she "resembles nothing so much as a debutante at a tractor pull." To me, she resembles a fair wind in a very stormy sea.
Chuck Berry's 1958 classic, now unfortunately used at John Kerry's campaign events, is titled "Johnny B. Goode," not "Johnny Be Good." An easy mistake, for sure, but an easily avoided one as well.
Nathan S. Carruth
Fort Worth, Texas
The catty swipes at Teresa Heinz Kerry by Michelle Cottle were irritating and unnecessary, and said far more about the author's character than Ms. Kerry's. "She resembles nothing so much as a debutante at a tractor pull" is clearly a personal, snide, and arrogant swipe at a woman not given to fitting into some shallow, politically correct persona. And in context, telling the "impertinent reporter" to "shove it" was appropriate and refreshing.
Finally, why should an intelligent person pretend to be fascinated after hearing a speech "a thousand times"? Is honesty such a liability in our political discourse?
Thomas E. Larson
Here is some of what we know about Antonin Scalia, the author of the Blakely opinion analyzed in Benjamin Wittes's excellent article "Suspended Sentencing" (October Atlantic):
1) He is sixty-eight years old.
2) Recent news photos reveal that he's considerably overweight, even obese.
3) He has a temperament that would once have been called choleric and is now called type A.
4) He eats fatty meats, such as duck, as he revealed after his infamous hunting expedition with the Vice President.
5) His public behavior has become notably odd in the past year: witness the hunting trip, topped by his bizarre, rambling twenty-page defense of it. His speeches have grown increasingly intemperate, to the point where even he agrees he cannot sit on certain cases.
6) He is the author of what Wittes justifiably terms "the single most irresponsible decision in the modern history of the Supreme Court." It is hard to see any remaining trace of the Antonin Scalia of the 1980s, respected for his intellect even by his ideological opponents, in the nearly incoherent Blakely opinion.
In his invaluable book Leaving the Bench: Supreme Court Justices at the End, David N. Atkinson paints sobering portraits of elderly Supreme Court justices who remained on the bench even as they suffered mental impairment resulting from strokes. It has happened repeatedly, with Justices Grier, Chase, Field, and William O. Douglas, among others. There is no mechanism for removing intellectually incapacitated justices from what has recently proved itself the most powerful political body in our system of government. We as a nation must face the possibility that history is repeating itself.
Albuquerque, N. Mex.
Benjamin Wittes's article about Blakely v. Washington is off the mark on two key points.
First, Wittes argues that Blakely deprives political institutions of their rightful authority to meaningfully guide judicial discretion in handing down sentences. But the legislature never had that power in the first place. Pursuant to Article III of the Constitution, the power to impose criminal punishment in any specific case is vested in the courts. Thus the sentencing guidelines (which Wittes correctly predicts are not likely to survive Blakely) are an example of legislative usurpation of the power of the judiciary. Blakely is only an attempt to restore the constitutional status quo.
Also, Wittes overestimates the potential impact of Blakely on the fairness of criminal trials. If the prosecution were forced to prove aggravating factors beyond a reasonable doubt before a jury, the guilt and penalty phases of the trial could be separated to ensure fairness. This is already a constitutionally mandated feature of capital trials.
San Francisco, Calif.
I found Benjamin Wittes's article very informative but ultimately mystifying. Wittes is right to point out the decision's sweeping and unpredictable impact, but his harsh criticism seems odd. I understand the phrase "maximum sentence" to mean the longest sentence you can serve if convicted of a particular crime. Apparently, in the past judges could add years to sentences on the basis of "facts" that had never been proved in a court of law. I am a historian, not a lawyer, so maybe I'm missing something, but I am shocked to think that if I were convicted of a crime, the judge could sentence me to prison for much longer than the maximum sentence for the crime of which I was convicted. What can be the objection to remedying this flagrant violation of constitutional protection? Wittes suggests that sentencing guidelines are "salvageable" if judges treat maximum sentences as maximum sentences. Well, yes—they should have been doing that all along!
As a historian I was also puzzled by Wittes's conclusion, in which he compared Blakely unfavorably with Brown v. Board of Education. Granted, we should all be grateful that the Court ended segregation in schools, but the requirement that schools desegregate with "all deliberate speed" is hardly "crystal-clear guidance." The phrase is an oxymoron, and it was seen as such at the time of the ruling. It was hopelessly vague, not crystal-clear—and that was one of the reasons school desegregation took so long.
Finally, I agree that the principle that separate education is inherently unequal is "clear" and "morally compelling"; but so is the principle that any "fact" that will deprive an American of freedom should be proved in a court of law.
Hyman Rubin III
Benjamin Wittes says, "Roe, whether you love it or hate it, affected only abortion policy." That decision, however, has affected a whole host of thoughts, laws, and maybe rulings about the rights of individuals over their own persons. I suspect it is one of the broader decisions made by the Court in many a year.
La Paz, Mexico
Benjamin Wittes replies:
Gene Vorobyov is correct that the power to impose a sentence is traditionally a judicial function and that the sentencing guidelines depart from a long pre-existing norm of allowing judges unfettered discretion within an often broad statutory sentencing range. But it does not follow, as he suggests, that this norm is constitutionally required. What, exactly, is wrong with a legislature's attempting to ensure some uniformity in sentencing by outlining and weighting the factors that should guide a judge's discretion? In other words, if it's constitutional for a law to make robbery punishable by anything from probation to twenty years in prison based on a judge's whim (as it certainly is), then why is it not constitutional for the law to say that a convicted robber's sentence within that range shall be based on an identified set of facts proved to that same judge's satisfaction?
Mr. Vorobyov is also correct that one could solve the problem the Supreme Court identified in Blakely by bifurcating all criminal trials, the way capital cases are now split between a guilt phase and a penalty phase. Doing so would also address my concern about the possibility of biasing juries by mentioning all the aggravating factors that might justify a harsher sentence. It would, however, be a very expensive solution to a problem that did not exist until the Court made it up.
Let me try to clear up Hyman Rubin's mystification. Nobody argues that if the maximum sentence for a crime is ten years, a judge can simply tack on an extra five on his own authority. But defining the statutory maximum sentence for constitutional purposes is a tricky business. What if a statute says that arson is punishable by ten years unless a judge finds a racial motivation for the crime, in which case a defendant can get twenty years? Is the statutory maximum then ten years or twenty years? Should the fact of the racial motivation be regarded as an element of the crime (which must be proved to a jury) or merely as a factor in sentencing (which need only be proved to a judge)? Traditionally, the courts have deferred to legislatures on this question. In Blakely the Supreme Court made itself the arbiter.
I read Sandra Tsing Loh's article "A Gloom of One's Own" (October Atlantic) with interest. It made me look forward to the day when female sexism is scrutinized as closely as male sexism.
Aren't the women of The Bitch in the House really enraged because their husbands will not do what their wives tell them to do? Would the women be willing to submit blandly to their husbands' orders? The comment "Just once I'd like to walk into the house and dinner is made!" is worthy of Archie Bunker.
Are these husbands really making no contributions to the household? Or are their contributions unrecognized and undervalued by their wives, much as the contributions of housewives were undervalued by their husbands in the past? The men's activities are not unimportant merely because the women consider them so, and I suspect the women and men find an equal amount of couch time.
My wife and I have been married for twenty-five years. We are not bitch and bastard; we are just people working to build a household as best we can. I suspect most couples fall into the same category.
Steven Troy Mitchell
"A Gloom of One's Own" is delightful. At about sixty I have looked after myself for decades, taking care of my cars and my clothes and my kitchen and everything else. Getting things done is as simple as setting priorities, organizing your efforts, and working efficiently—and men are better at all that than women are (that smaller male corpus callosum allows for more left-brain dominance when needed, don't you know).
Palo Alto, Calif.
I can barely describe the joyful satisfaction I felt after taking a minute (or thirty) to recline on my couch and read "A Gloom of One's Own." It had been a long time since I took time out for myself, and it was well worth it. Not only did I feel compelled to write this letter to the editor, but I wrote a poem as well!
Hurrah for Sandra Tsing Loh for finally asking what I feel is the question for highly stressed, career-driven women: What's so great about work anyway? I mean, is editing a magazine that's largely about perfume and makeup really more existentially meaningful than raising a solid individual who also happens to be one's own child? Why?
Almost no one I know truly loves his or her job. My doctor friends dream of early retirement. My lawyer friends are utterly bored. Professors find that the satisfactions of research and teaching are eclipsed by departmental infighting and the drudgery of committee work. I've had a series of marketing-related jobs, and always tire of them after the first six months. Once a job is mastered, it's not interesting anymore, and none of us can believe we spent all those years getting educated for only this.
Women are now free to be as miserable as our fathers were, and our cheerful embrace of this alleged freedom, our unexamined willingness to sling our children into day care to help those couch-lounging bastards bring home the bacon … well, how Stepfordian is that, really?
I haven't read a magazine article for the past seven years, having become the mother of a beautiful daughter. But I was lucky enough to come across Sandra Tsing Loh's article when my stay-at-home husband left the magazine on the coffee table and I had ten minutes to myself.
I've been through the nanny phase, in which I made the bulk of the money for my household, and the work-from-home phase, in which I didn't (much to my husband's dislike). The work-from-home-and-care-for-the-child phase almost drove me crazy.
This year I traded places with my husband. I took an office job for the first time in four years; he left his job to care for our daughter over the summer and, supposedly, to write more music. What happened was that he found he had no time for music (hmmm). He enjoyed staying home with our daughter. And he developed a dinner repertoire of turkey burgers, salad from a bag, and Trader Joe's frozen fries.
I work in an office of twentysomething men and recently found myself giving an engaged man the following advice:
1) It's a myth that women can have it all.
2) Don't think that your wife is going to want to work full time when you have children, even if now she insists she will.
3) Have children before you're thirty. Otherwise you'll be too exhausted to enjoy them.
After I said that, I retreated back to my computer screen, wondering if I sounded too bitter. And then I realized why.
A few years ago, when I had to let go of the nanny, I took my then four-year-old daughter to the beach with my cell phone so I could do a little work while I watched her romp in the waves. While the sun was burning off the fog, and the water reached across the sand for our toes, my daughter turned to me and said, "Mommy, before you were a Queen Mommy." "And now?" I asked her. "Well, now you're a real mommy."
Thanks again for helping me to realize that I'm not alone. Tomorrow I'm going to ask for flextime.
Los Angeles, Calif.
Kudos and gratitude to Sandra Tsing Loh for getting so much right in her thoughtful, smart, and oft hilarious essay about the books my husband and I edited, The Bitch in the House and The Bastard on the Couch.
I must, though, take issue with one of her points, because she's not the first to make it; in fact, I'm growing rather weary of those whose solution to stressed-working-mother syndrome is that mothers of small children simply stop working. "Is it really such a great loss to cut back on the writing for a few years and just … get those kids into kindergarten?" she laments. "After all, not all prose is deathless; some of it is terminally ill."
It's a nice thought (I myself wouldn't have minded taking a few years to lie on the couch nursing, napping when the babies did), but like the vast majority of Bitch contributors, not to mention mothers in general in this country, I didn't have that option. Dan and I both work six or seven days a week and most evenings (albeit with time factored in to care for the kids when they're not in school) to squeak by in roughly the same lifestyle in which we were raised (moderate home, public school), both of us by mothers who stayed home when the kids (four, in my mother's case) were young. True, we're writers, whereas my father is a physician and Dan's is a professor, but today most professors and even some physicians, not to mention your average accountant, real-estate agent, editor, insurance rep, teacher, plumber, cop, politician, bank employee, or franchise owner, would be hard pressed to raise a family decently without an additional income. That's one reason why 72 percent of mothers in this country work, 30 percent of them out-earning their husbands.
Tsing Loh suggests that we "slow down to the pace of Ellen Gilchrist," who writes, "One of the reasons I am happy now is that I did the work I had always dreamed of doing. But I didn't start doing it seriously and professionally until I was forty years old … I was too busy falling in love … and having babies and buying clothes and getting my hair fixed and running in the park and playing tennis."
It's a nice life if you can get it, but most of us can't. So can we please stop suggesting it as some sort of viable alternative to a mother's making a living?
Sandra Tsing Loh replies:
Anyone who can do the math knows that writing—unless it's in a proven financially viable field like Internet porn—is not a particularly sensible economic choice. (Indeed, as a purveyor of such country-gentleman arts as public-radio commentary, short stories, and the occasional novel, I have—like some errant soybean farmer—often brought greater economic benefit to my family when I've stopped producing.) Things like creative freedom, choice of projects, and extended bouts of flagrant self-expression are expensive—yet these are luxuries that oft published female writers like Hanauer and me clearly enjoy, even as we struggle to diaper our children and all the rest. Which is why I think Hanauer's suggestion that most of the Bitches somehow had their economic hands tied behind their backs is disingenuous and possibly even insulting to the teachers, plumbers, and bank employees she speaks of so passionately. After all, unlike several author friends I have on both coasts, bank employees rarely sour the moods of their fellow men with Camille-like complaints about the hectic publicity schedule of (and possible airport allergens encountered during) their seventeen-city Knopf-funded book tours. As to which women have the tougher gig, the stay-at-homes or the draw-the-paychecks, the Gilchrist passage Hanauer mentions also went on to describe Gilchrist's writing up PTA minutes and plays for the dinner parties of her husband's law firm. If that dutiful wife's work doesn't sound like a grind, I don't know what does. And yet unlike so many rageful women of today, Gilchrist had the elegance not to blame society for her domestic situation. So in short, Cathi, I disagree with you utterly! But I still respect you deeply, you and Dan! Call me! We can work this out! Maybe in a new essay collection!
I read with interest the small feature "A Nation of (German) Immigrants" ("Primary Sources," October Atlantic). I was dismayed, however, to see that Hawaii is not included in your map of the United States, although its statistics are presumably included in the relevant U.S. Census Bureau report and its population is greater than that of eight other states plus the District of Columbia, according to the 2000 census. Nor is Alaska, the population of which is greater than that of Vermont and Wyoming, represented.
Turning the page to "Calling All Nations," I see that Hawaii is included in the map but mysteriously not shaded in the same color as the U.S. mainland or Alaska. I cannot imagine the reason for this additional oversight.
James J. Nelson
James Nelson is correct: Alaska and Hawaii were missing from the "Nation of Immigrants" map, and Hawaii was shaded incorrectly in "Calling All Nations." We regret the oversight.
In "Who Will Be the Next President?" (September Atlantic), Nathan Littlefield writes that in 1996 Bill Clinton received 54.7 percent of the vote and Bob Dole 45.3 percent. In fact Clinton received slightly more than 49 percent of the vote in that election. Your statistician apparently forgot the 7.8 million votes Ross Perot drew. You may verify the correct statistics at www.search.eb.com/elections/etable3.html.
Bill Jaaskelainen Jr.
Nathan Littlefield replies:
The percentages in the article were not meant to show each candidate's share of the popular vote as a whole; instead they show his share of the two-party popular vote. This is the outcome on which participants in the Iowa Electronic Markets are "betting," and it usually corresponds to the winner of the election. The markets did not include a contract for Perot in 1996. I should have mentioned these facts in the article.
That said, Bill Jaaskelainen's numbers are correct for the overall 1996 popular vote. He might also have noted that with George W. Bush garnering only 48 percent of the popular vote in 2000, we have not had a President elected with a popular majority since 1988.
October's "A Look Back: 125 Years Ago in The Atlantic" features an editorial purportedly by William Vaughn Moody. But Moody was born in 1869, so he could not have been the author of that 1879 editorial. The columnist is more likely to have been Winfield Scott Moody, a journalist and newspaper editor who would have been twenty-three in 1879.
Susan G. Larkin
Thanks to Susan Larkin for the good catch. In fact, as it turns out, the columnist was neither William Vaughn Moody nor Winfield Scott Moody but William Goodwin Moody.
I respect Peter Bergen and the reputation of The Atlantic ("The Long Hunt for Osama," October issue), but I think bin Laden is dead. As far as I can tell, no one credible has seen him since October of 2001, when he modestly accepted accolades aired by al-Jazeera for the 9/11 attacks on us infidels. The videos seen since were all sans audio and could have been made before then. The audios released have been deemed legitimate by our government specialists, but since bin Laden allegedly has twenty children, his survivors might have been able to find a grown son with a lot of his father's genes, raised in an area that would produce the same accent, to make the scratchy recordings released to us. I also think al-Qaeda has been seriously weakened, mostly by our assiduous efforts to block its sources of money. Keeping bin Laden "alive," if he is in fact dead, would be a logical strategy for those hoping to keep the movement alive. Perhaps we will soon hear of a miraculous "resurrection"—something that had some success in another religious movement.
In his review of Margaret Mauldon's translation of Gustave Flaubert's great novel Madame Bovary ("No Way, Madame Bovary," October Atlantic), Clive James praises the "physically handsome" format of the book but chides the editors at Oxford for their choice of anachronistic cover art: James Tissot's Young Woman in a Boat, done in 1870, thirteen years after the publication of Madame Bovary. To some this will seem a pedantic cavil, but as James rightly observes, by simply glancing at the cover careful readers may begin to suspect that "in the matter of historical fidelity things are out of kilter" in the text itself—a suspicion that he finds solidly confirmed in this case.
Ironically, James's review sports an illustration of a fashionable young lady whose clothing and hairstyle—even the chair on which she is seated—all place her firmly in the Regency period, some forty years before the publication of Flaubert's masterpiece. The most egregious example I have seen of a discrepancy between cover art and the historical setting of the novel it graces is the 1995 Norton Critical Edition of Ford Madox Ford's The Good Soldier, a work whose literary merit is regarded by many critics as equal to that of Madame Bovary. Ford's novel is set in the years immediately preceding World War I, yet Norton inexplicably chose for its cover a portrait by Rubens, Young Woman in a Straw Hat, from around 1625. Why, I found myself asking, would someone charged with selecting the art for a withering dissection of Edwardian society overlook works by Sargent, Eakins, or Whistler, or one of the many talented but lesser-known society painters of that era? That both these gaffes were made by respected publishers leads one to conclude that historical fidelity is becoming expendable even among those whose credibility would seem to depend on maintaining it.
Gary L. Kriewald
Corby Kummer's breathless tribute to Alice Waters ("Good-bye, Cryovac," October Atlantic) struck a chord with us. My wife and I attended different elementary schools in Los Angeles in the 1930s; both of them featured vegetable gardens that we pupils prepared, planted, and tended. At harvest time we took home a dividend of fresh veggies, and in her case held a "farmers' market" sale of the bounty. The gardens were used in a variety of instructional ways, from teaching about agriculture to illustrating plant growth to helping with arithmetic.
By the time our children began elementary school in Los Angeles, in the 1950s, the gardens were gone, paved over like the rest of the schoolyard. In our day, school playgrounds were mostly covered with gravel. Skinned knees were endemic, and maintenance, I suppose, was costly. The solution was to create asphalt jungles, where the lines for softball, dodge ball, basketball, and volleyball could be painted on without having to be replaced every few days. Unfortunately, the gardens were eliminated along with the gravel.
Our youngest daughter teaches third grade in Woodinville, Washington. Now inspired, perhaps we can persuade her to campaign for a garden at her school.
Richard H. Hill
Although I agree with much of Jonathan Rauch's thesis regarding the need for yin and yang in government ("Divided We Stand," October Atlantic), it should be noted that George W. Bush's success as governor of Texas, and his reputation as a "uniter," were due to the vast experience and political skill of his lieutenant governor, Bill Hobby, a Democrat and a legend in Texas. While Hobby basically ran the state government and successfully worked out legislative deals, Bush, as front man, spent most of his time having harmless, congenial chats with various lawmakers and signing not-so-harmless execution orders. Let's give credit where credit is due.
In national office Bush has been exposed as the featherweight he is: a man with little knowledge of how the world works. When he fell into the clutches of his new nannies Cheney, Rumsfeld, et al., the neocon winds easily blew Bush's positions on critical issues to the right—where he seems quite comfortable anyway, and where the primary goal is not passing legislation but amassing raw power and retaining it.
Carol de Lamadrid
I enjoyed Alexandra Starr's article on the new kind of Democrat emerging in the South, employing competence, wit, and southern charm ("Dixie Chicks," September Atlantic). I was disappointed, however, to read Starr's swipe at Hillary Clinton. Clinton was elected senator by the people of New York, a diverse citizenry who might not cotton to the southern charm Starr describes in her piece. But when Clinton comes to the Hudson Valley, it's standing room only.
Why the jab at her wardrobe of pantsuits, as if her choice of clothes made her and other similarly attired women less feminine? Or the implication that her liberalism is a negative? She fights for the people, and if that's liberal, more power to her. What a shame to contribute to the false notion that women who are competent, aggressive, and ambitious aren't feminine unless they follow outdated rules. The "Dixie Chicks" may have to abide by those notions to make it in the South, but it's a shame that the perception of femininity is an issue at all. We don't seem to define men by degrees of masculinity, although an overtly feminine man might not pass muster in some circles.
People who deal personally with Senator Clinton find her womanly, feminine, and, yes, even charming. She's an excellent senator, and we are lucky to have her, pantsuits and all.
In "The Hollywood Campaign" (September Atlantic), Eric Alterman puzzles over why "so many people in the media find it less objectionable … for the CEO of General Motors to lobby for relaxed auto-emission standards than for an actor or a director to contribute to a campaign for clean air." Leaving aside the question of how many people actually object to clean-air campaigns, I think I can explain the general phenomenon.
Whereas auto executives engaged in lobbying are rightly perceived to be operating from self-interest, it can also be assumed that theirs is an informed self-interest. No such assumption, however, can be made when a person famous for singing, acting, or just outstanding physical attractiveness uses that fame to influence economic, legal, or policy issues. Though they like to flatter themselves by posing as political dissidents, it is not the causes celebrities choose to endorse that most people, including those in the media, find maddening. Rather, it is their presumption in speaking publicly on matters they appear to know so little about, combined with the arrogance of granting themselves permission to do so and an inconsistency between word and deed too childish to be called hypocrisy.
Todd D. Clark
New Orleans, La.
Corby Kummer ("Principled Pork," September Atlantic) failed to mention that Niman Ranch products are available in local markets at prices lower than those quoted on the Web site he references. After drooling through his article, I visited the Niman site, scanned the tasty-sounding recipes, and ordered two pork rib chops ($24.00), two seven-ounce beef filets ($44.00), and a twelve-ounce package of applewood-smoked bacon ($8.00). The products arrived by overnight delivery in good condition. The bacon—in fact, all the meat—was as flavorful as Kummer had promised. But I later discovered the same bacon at Trader Joe's for $4.69 and the same pork-chop package for $6.79.
Huntington Beach, Calif.
I was pleased to see your piece on the world's most athletic nations ("Olympic Elite," July/August Atlantic). I have long been annoyed by the medal standings that are on constant display in the media during the Olympics and afterward. What is the point of comparing the U.S. medal count to that of, say, Eritrea, an impoverished nation of four million? Your article gets on the right track by bringing population into the picture. But it takes more than a large population to produce medals. It takes money to pay for coaches, facilities, and living support for athletes. So a better measure would be to divide a country's per capita gross domestic product by the total number of medals it won; the lower the quotient, the stronger the showing. If you use this calculation for the Olympics just finished, the top "winners" are, in order, China, Russia, Ethiopia, Cuba, Kenya, North Korea, Ukraine, Uzbekistan, and Romania. The United States is eleventh. Of course, other cultural factors come into play that are difficult to quantify, such as government support and overall participation and interest in sports. But this basic analysis is eye-opening.
Vancouver, British Columbia
I enjoyed reading Ian Frazier's humorous article "If Memory Doesn't Serve" (October Atlantic). Frazier is mistaken, however—or, perhaps, making a subtle joke on the theme of his article—when he says that Victor Klemperer, whom he tends to confuse with his "cousin" Werner, kept a detailed "journal of his days in Berlin during World War II." Klemperer (Victor, that is), whose published diaries precede and go beyond the actual war years, was living in Dresden, where he was a professor of Romance languages at the local technical university (until he was removed from his post by the Nazis), and where he managed to survive the Third Reich as one of the city's few Jews. Maybe Frazier mixed him up with his cousin Otto, who was a classical conductor in Berlin in the early 1930s before he emigrated to the States. Werner Klemperer was actually Otto's son and therefore Victor's first cousin once removed.
Owing to a series of editorial miscommunications, Heather Thomas's letter about my September article "The Hollywood Campaign" was published in the November Atlantic without my having seen it in advance. This is unfortunate, as Ms. Thomas's confusion about a number of matters manifests itself in a series of false assertions, based apparently on imaginary conversations.
Although Ms. Thomas professes to be able to discern my thoughts and my intentions, I will confine myself to the relevant facts. I interviewed Ms. Thomas one evening in her home last spring. I decided it would be unkind—and unproductive—to quote her at any length lest she be taken (inappropriately) as representative by those who seek to discredit all Hollywood liberals and progressives. After the initial interview I received a series of invitations from Ms. Thomas to breakfasts at her house and other events, which I declined. In the end her participation in the piece was limited to a single sentence in which I described the vehicular population of the driveway on the night she had me over.
This judgment on my part has apparently angered Ms. Thomas. In her letter she accuses me of "looking to pump up [my] populist image, which took a major body blow this year when [I] plunked down a fat wad on a home in East Hampton." Ms. Thomas claims that when she asked about my own charitable giving—an exchange of which I have no recollection whatsoever—I replied, "M-m-my children go to public school!" This strikes me as not only logically inconsistent but impossible to imagine. After all, I have a single daughter, not "children"—something that is rather difficult for a father to mistake. As for that (quite modest) house, my family bought it not last year but nearly five years ago, thereby rendering Ms. Thomas's amateur psychologizing misplaced at best. In any case, I make no apologies. Nothing is too good for the working class.
I leave it to the reader to discern the accuracy of those aspects of her letter that are devoted to reading my mind.
New York, N.Y.