In "People Like Us" (September Atlantic), David Brooks highlights the tendency of like-minded individuals to form homogeneous clusters. In that exercise he takes a somewhat passive approach to the phenomenon, implying that it happens over time and possibly without awareness. In the workplace this homogenization is a great deal less benign. Applicants for jobs, and the organizations to which they apply, often go through an elaborate dance to determine if they are compatible. This dance is performed through recruiting, plant visits, interviews, and so forth. If a compatibility of values is apparent (assuming, of course, that the basic credentials/ abilities/skills are also present in the applicant), an offer is made and often accepted. But things are not always as they seem. Organizations do not always portray their values accurately, and applicants have a tendency to provide a view of themselves most likely to result in an offer. Then the fun begins. In most cases the organization takes increasingly direct steps to eliminate the nonconforming hire. This might take the form of withholding information from the new hire to limit accomplishments, or providing stultifying or doomed-to-failure assignments. If these tactics don't work, the strategies become more direct—poor performance reviews, public criticism, and so on. Eventually the new hire "gets it" and leaves.
This model has been dubbed the attraction-selection-attrition model by behavioral scientists—most prominently Ben Schneider and his colleagues at the University of Maryland. Their model identifies the personality of a founder or a strong leader as the "core" of the homogeneity. A founder or CEO surrounds himself or herself with people who share his or her values. These deputies, in turn, recruit and hire people who are similar to them (and, by extension, to the founder/CEO). Thus the organization becomes populated by people with the same interests and values, who tend to think alike (the "groupthink" phenomenon). While paying lip service to "diversity," the organization sets about creating and maintaining homogeneity.
Some implications of this model are interesting. Organizations commonly endorse changing their "culture" through various interventions (organizational change and development programs, seminars, workshops). In fact cultures change with new leadership. The legendary "turnaround leader," credited with single-handedly changing the culture, actually sweeps in like the lead biker in a pack of Hells Angels, collecting trusted (that is, like-minded) deputies from earlier corporate lives. So what actually happens is more like corporate cleansing than a Herculean effort by a single person to persuade incumbents to change their way of thinking. Bill Bratton became the new police chief in Los Angeles earlier this year. When he arrived, he announced that he was the "new sheriff" in town, and that anyone who didn't sign on to his vision of a reformed department should "put in their papers" (retire or quit). A similar phenomenon occurs with a merger or an acquisition. Two organizations vow to form a stronger union from the diversity of approaches and opinions of the leaders of those organizations. Six months later the weaker of the two leaders is gone, along with his or her deputies.
Schneider asserts that "people make the place." He appears to be right. Brooks observes the same phenomenon in nonwork settings. Central to both settings is the desire to associate with people who share one's values. Without the random variation of diversity, social groupings and organizations will inevitably succumb to external forces and fail to realize their respective potential. Diversity comes at a cost. This cost includes the willingness to acknowledge the value of "other" ways of thinking and doing and the possibility of dealing with the complexity of life rather than simplifying it by creating ideological walls. I am not as optimistic as Brooks about the possible solution to the problem. He suggests that the simple act of "experiencing" the diversity of the human condition (for example, go to Branson, subscribe to The Door, go to a megachurch) might be the first step—like trying a new cuisine. Values, interests, and ways of thinking are not that easily modified. What is more likely to happen is that this cultural "slumming" will simply strengthen the ties to one's own group.
Although David Brooks's analysis of Americans' segregationist impulses is largely on the mark, I can offer at least one example of hopeful integration. Brooks writes, "Maybe somewhere in this country there is a truly diverse neighborhood ... But I have never been to or heard of that neighborhood." At least one neighborhood in Philadelphia—the district known as Mt. Airy—is such a place. According to the 2000 census, West Mt. Airy, where I live, is 44.8 percent black and 52 percent white. On my block, with fewer than two dozen homes, one can find straights, gays, blacks, whites, Hispanics, Asians, young families with children, and retired couples. Mt. Airy may be a rarity, but an appreciation of diversity—not as an abstraction but as a part of one's daily life—can demonstrably be the force that attracts people to a community and sustains its character, just as progressive mountain bikers find a home in Boulder.
David Brooks asks that we sympathize with the poor, socially ostracized evangelical Christian who is not welcome in the halls of America's elite colleges and elsewhere. In laying out his case he makes the preposterous assertion "It's appalling that evangelical Christians are practically absent from entire professions, such as academia, the media, and filmmaking." I do not profess to be a great student of popular culture, but evangelical Christians seem to me to have in fact segmented the market just like any other interest group. The fundamentalists have their own universities, their own publishing industry (note the phenomenal success of the Left Behind series), their own bookstore chains, their own television networks and news programming, and thousands of small businessmen who use Christian symbols to advertise themselves to a like-minded customer base. Evangelicals are everywhere, and a lot of money is being made by these poor outcasts.
At Harvard in the 1950s, despite a wide-ranging effort to recruit students from many sources, I was one of fewer than a handful of students with a farm background. Equally sparse were evangelical Christians and political conservatives. The conformity cops were already in evidence: when the redoubtable scholar Mark DeWolfe Howe was invited to be faculty adviser to the college Republican Club, he responded that such a connection would do his career little good. Ultimately the dean's office allowed our club to function without the required adviser, possibly because it would have been embarrassing for the college that "broke McCarthyism at the stump" to have no one willing to advise Republicans.
As an academic in the late 1970s, I was told in several interviews that I was the ideal candidate for a job—if I were only a black female. I was in the job market because the university at which I had been teaching eliminated its medieval program in favor of something "more relevant." The designation "relevant" was dictated by administrators, not derived from student enrollees. That is how I came to be a sawmill operator.
Again and again in academia I was advised to learn a new specialty in order to remain employable, adding expertise in, for instance, early slave narratives or the roots of jazz. But I had pursued, entirely on my own dime, a childhood love of Malory, Chaucer, and sagas. In our national rush to honor the roots of people concerning whom we feel guilty, we are as a nation ignoring our own cultural, political, and institutional roots. I am not sure the result will be a very good polity.
Bruce P. Shields
David Brooks is certainly right that Americans tend to congregate with like-minded souls. But his examples all seem a little one-sided, suggesting that liberals are the ones who need to mingle with conservatives. I would suggest that the latter are no less in need of enlightenment. I count myself as liberal and progressive. I don't wear a sign, but I do wear Birkenstocks. Perhaps that's the giveaway, because on a trip through rural areas of Oregon last spring (areas that seemed quite conservative in culture) I was regarded with a suspicion that bordered on paranoia. Maybe the folks in Branson, Missouri, need to spend some time chatting with the folks in Boulder, Colorado. And how many liberal academics would be welcome at an evangelical Christian college? I'll venture a guess that their faculties are more "closed" than Princeton's. Everyone needs an open mind, not just liberals.
In his generally perceptive essay on diversity, David Brooks underestimates the extent to which patterns of residential segregation are matters of external circumstances rather than personal preference—in no respect more so than in the isolation of African-Americans. A large stack of solid research shows that if it were up to them, blacks would live in much more integrated neighborhoods than they do now. They—even middle-class blacks—remain highly segregated because of continuing racial discrimination in housing and because most whites accept only token levels of residential mixing with blacks.
Claude S. Fischer
University of California at Berkeley
David Brooks writes, "Brainy people with generally liberal social mores flow to academia, and brainy people with generally conservative mores flow elsewhere."
Although all generalizations are flawed, Brooks might consider the following, possibly more accurate underlying formulation: Brainy people pursuing knowledge flow to academia. Brainy people pursuing wealth flow to business.
David Brooks suggests that "any registered Republican who contemplates a career in academia these days is both a hero and a fool." Colleges and universities do not necessarily exclude conservatives in hiring. Equally likely is that the personal traits of heroism and foolishness that lead people to choose academic careers are more prevalent among liberals.
In most academic disciplines, especially the social sciences and humanities, salary prospects for professors are vastly lower than what similarly capable individuals with M.B.A.s, rather than specialized Ph.D.s, could earn in the private sector. Why do some people still choose academic careers? They do so either because of the perceived prestige of the professoriat or because of the satisfaction they derive from teaching, advising, and doing basic research. In exchange for personal satisfaction, academics compromise on salary.
This do-gooder's willingness to give up private gains for public benefits is a hallmark of liberal philosophy. Analogously, liberals tend to favor more taxation and greater public provision of goods and services. Conservatives prefer to keep more of their money for themselves, and to advocate less government involvement in everything. If you'd rather not give up potential earnings in exchange for the opportunity to mold young minds and create new knowledge for society, and would prefer to take a greater percentage of your compensation in the form of money, you are likely to avoid academia.
The preponderance of liberals in our colleges and universities does not stem solely from the preferences of hiring committees. Even in the absence of any hiring-committee prejudices, potential professors do not randomly "flow into academia" or "flow elsewhere." These are smart people, and they are more than able to compare the benefits and costs of different career choices and select the career option that most appeals to them. Academia tends to be more appealing to liberals.
Trudy Ann Cameron
University of Oregon
David Brooks is exactly right that people everywhere tend to group themselves according to common backgrounds and interests, and that this is a phenomenon less of snobbery than of people's making life easier for themselves in an understandable way. But this tendency of human nature is precisely why a republic must work so hard to ensure that citizens of all backgrounds hold something in common that goes deeper than what they watch on TV or their opinion of motor homes. The veterans of World War II may not have returned to live harmoniously in a caste-free society, but they all acquired the humbling (and uplifting) sense that they were peers who sacrificed for their country. A similar experience is exactly the remedy for the relatively innocent stratification Brooks describes. This could come in state-based "militias," where all young men and women of a certain age would spend a year in tight, forced camaraderie, digging ditches, fighting fires and floods, and doing any other crucial grunt work of the moment. The important thing is that no one be exempt from the same regimen of sacrifice, hard work, and discipline for an entire year of his or her life. Back in civilian life, it's a safe bet that Ivy Leaguers or Teamsters would once again cluster together—but perhaps with a greater degree of basic democratic respect for their fellow Americans. In other words, promoting knowledge of one another's diversity might be the best that a democratic society can do, short of actual "diversity," and that's certainly a step ahead of where we are now.
Palo Alto, Calif.
I enjoyed H. W. Brands's attempt to demystify the Founding Fathers, bring them back to earth, and put them on a more properly sized pedestal ("Founders Chic," September Atlantic). Remembering that they were real human beings, whose substantial mistakes haunted us later on, is crucial if we are to put our own challenges in perspective and recognize that we are just as able to muster what it takes to accomplish great things.
However, on the final page Brands posits the reaction that the Founders would most likely have toward several contemporary constitutional controversies, and I have to take serious issue with his supposition about one in particular: campaign-finance reform. He believes that "all the Founders would have been shocked at the overwhelming role of money in modern American politics."
I haven't run the numbers, but I suspect that campaign spending relative to per capita GDP in the eighteenth and early nineteenth centuries wasn't nearly as far from today's standard as Brands seems to believe. Additionally, congressional districts have astronomically increased in size and population. Candidates for the first Congress could personally meet and interact with a substantial number of their voters. At a minimum, the geography and limited population of the day meant that the vast majority of voters at least knew somebody who knew the would-be congressman very well, even if they didn't know him personally. The opposite is true today, and that means a whole lot more money must be spent to introduce candidates to the electorate.
Toss in the added cost of running a presidential election across five time zones, and you start talking about real money.
And that's just the beginning.
Remember also that U.S. senators were not popularly elected under the original version of the Constitution. Brands decries our contemporary timidity about altering the Constitution to address issues such as campaign spending, but he does not examine how this past modification affected that very issue. Leaving aside population and geographic growth, popular election of senators at least doubled the campaign activity and spending necessary to make representative government work at the congressional level.
Then there are the modern media: increasingly fractured, narrowly tailored, ever more expensive to use as an advertising tool, and absolutely essential for reaching a national audience. Brands nicely points out that Ben Franklin would be most impressed by the amazing technological advances of the modern world. I think Franklin would be pretty quick to add up the price of a thirty-second national TV spot, toss in the cost of the political consultants needed to solicit news coverage for a candidate on thousands of broadcast stations and publications, and then divide by the number of voters, concluding that the bill for seeking the presidency computes just about right.
Similar dynamics are at work all the way down the ballot, and you can also throw in the additional cost of running elections in the thirty-seven states that have been created since the Founders' era (that's seventy-four more U.S. Senate elections, if you're keeping score).
My inclination is to believe that the Founders would be shocked by the sheer number and cost of perks for modern officeholders. Leaving aside the huge staffs at the federal level, many states now have well-compensated full-time legislators with individual support staffs and free office space. Beating them at the ballot box necessitates that a challenger counter all that taxpayer-subsidized name identification.
Ultimately, given the enormous size of the modern American economy, the cost of running elections is still a tiny fraction and a good bargain. The drafters of the Constitution could not possibly have envisioned the electoral system that would be needed to serve such a huge and powerful nation, but they would have been hard pressed to create one that did the job any better, or with more resilience and dynamism, than what we now have. Indeed, even today I haven't seen anyone offer a persuasive alternative. This is one area in which we should put the Founders on a pedestal and admire their genius.
As suggested in "Founders Chic," the American people ought to propose and discuss "bold," positive, and thoughtful constitutional amendments at the federal and state levels, and then allow them to proceed through the full public debate prescribed by our constitutional democratic procedures, taking their chances on success just as was done in 1787-1788 for the Constitution and the Bill of Rights, and for proposed amendments since.
Likewise, the American Civil Liberties Union should use its ample energies and funds to do the same with changes that it espouses. The real complications and time involved in approving such amendments are purposeful and wholly sound.
What we should all now stop doing is using "judicial activism" in the courts to evade the proper constitutional-amendment procedures. The U.S. and state supreme courts should use due restraint when faced with issues and situations that suggest the need for fully considered constitutional amendments, and recuse themselves in favor of our prescribed and proven historical democratic processes. America will be all the better for it.
John A. McVickar
Your September article by H. W. Brands usefully reminds us that the Founders were not as highly regarded by their contemporaries and some subsequent generations as they are now. But I disagree with his suggestion that they may not have been so exceptional after all.
Cast an eye over the history of American administrations, and you find a dreary succession of Pierces, Fillmores, Clevelands, and Carters, rarely relieved by a towering figure like Lincoln. I can think of only two examples of an entire leadership cadre (as opposed to a solo act like Lincoln) that quite consciously set out on a historically important process of constructive statesmanship and carried it to success, both by the leaders' own standards and by the consensus of history. The Founders were the first, of course; the second was the remarkable generation—Marshall, Acheson, Kennan, Bohlen, and others—that fashioned the postwar settlements in Europe and Japan.
Those achievements are the more remarkable in the broader sweep of history when one considers how rarely the most talented leaders have applied themselves to positive statecraft rather than to conquest or personal aggrandizement.
Charles R. Morris
New York, N.Y.
I am unconvinced by H. W. Brands's attempt to take the Founders off their pedestal. He gives us some vituperations from contemporaneous political opponents, which should be beneath notice. And I can't seriously blame the Founders for failing to end slavery when their business was creating a new country on new principles—or for the undoubted imperfections here and there in their work.
Brands is more persuasive when he writes about defects in the Constitution that we cannot bring ourselves to see or to change. One great defect is that the mechanism for amendment is so difficult and cumbersome. This often drives reformers to seek back-door amendments through court decisions. Another is that the mechanism for resolving disagreements about interpretations of the Constitution was not spelled out, leading to the slow growth of the doctrine that the Constitution means anything the Supreme Court says it means. The Founders would be heartbroken to find that the First Amendment has been twisted to protect pornographers and flag burners. They would be astounded to find that the Supreme Court has discovered a constitutional "right to privacy" that is simply not there.
We should remember that the Founders objected not to an aristocratic government but to an alien aristocracy of birth rather than a domestic one of merit. If Franklin wanted the President to serve without salary, as Brands writes, that means that Franklin wanted only independently wealthy men to be President.
The Electoral College is undemocratic, H. W. Brands contends, and would not be included if the Constitution were written today. But imagine the 2000 election if the Electoral College did not exist. How quickly could the votes in fifty states have been counted, recounted, and counted again? You can be sure that in any state in which the vote was close, like Florida, the vote would be recounted ad nauseam, and we would still be complaining about the outcome. The value of the Electoral College is precisely in very close elections.
I suspect that H. W. Brands is correct in many ways when he suggests that we have gone a bit overboard in our reverence for the Founders. The way we "interpret" the Constitution is usually based on what we would like it to mean. That said, Brands makes one critical comment with which I take exception: "By the 1820s grave sins of omission hung ominously over the country: the Founders' failure to deal with slavery, and their failure to specify whether sovereignty lay with the states or with the nation."
I believe that some of the Founders (Adams and Franklin) wanted very much to outlaw slavery and also to have national sovereignty, whereas others, most notably Jefferson, would entertain no such thought. All preferred half a loaf to none at all. Had either school of thought chosen to press for a clear-cut statement, the Constitution could never have been passed by Congress, or ratified by the states. The Founders had to keep the desired end result—independence—as their primary goal, and hope that succeeding generations would have the gumption to take care of the other issues. But for their perseverance we might today be British subjects—or, worse yet, French. Painful as it was, we have taken care of both issues.
Eau Claire, Wis.
I disagree with H. W. Brands that "in revering the Founders we undervalue ourselves and sabotage our own efforts," because in addition to personal background and training it will probably take an alarming crisis to bring out those leadership traits we so admire. Unfortunately, although troubling times make a great, readable history, I don't know how many would actually want to live through them.
H. W. Brands replies:
Ken Braun thinks America is getting good value for its campaign dollars and suggests that the Founders would think so too. I couldn't disagree more. In the first place, the Founders didn't envision a democracy, and therefore didn't anticipate that candidates would have to be "introduced" to voters by the millions. Most of them assumed that the candidates would already be well known to the relatively few people able to vote. But even if they had envisioned our modern democracy, they would hardly have been happy with the enormous role of money in making it work. If the Founders agreed on one thing, it was the importance of civic virtue in a republic, and though—conceivably—some of them might have considered our current mode of campaign finance efficient, they wouldn't have considered it virtuous. George Washington pressing the flesh at fundraisers? The mind boggles.
John McVickar dislikes what he calls "judicial activism." The Founders probably would have disliked it too, but for a different reason. They had no idea the courts would become a branch of government co-equal to the legislative and executive branches; that was the work chiefly of John Marshall (whom Jefferson despised). But where most critics of judicial activism appeal to some "original intent," on grounds that the Founders got it right and we shouldn't mess with their work, they themselves would have rejected this notion for the ancestor worship it is.
Henry Adams asserted that "the progress of evolution from President Washington to President Grant was alone evidence enough to upset Darwin." Charles Morris seems to agree. Yet Adams's conclusion was—and mine is—that this demonstrates not that every subsequent generation was inferior to the Founding generation but that the most qualified individuals in the later generations didn't go into politics. The question for our generation is how to get the best people to run for office. One way to start is to remove the prerequisite of being able to raise tens of millions of dollars.
Roger Burk says that Franklin wanted only financially independent men to be President. True enough. Franklin believed they would be less corruptible—in all senses of the word—than those who still had to earn their living. Two centuries of American practice haven't proved him wrong. The Founders might well have been astounded to see the First Amendment sheltering pornographers; but they would have been less upset at its protection of flag burners, because they had considerably less reverence for the flag than we do. As for a right to privacy, it's difficult—for me, at any rate—to read the Bill of Rights and not conclude that an underlying right to be left alone by government is just what they had in mind. This was much of what these Lockeans fought their revolution for.
Joyce Timm suggests that the Electoral College protects us from unending recounts. Maybe not. It is the very rare presidential election that is really close in the popular vote. In the narrowest popular tally in the past hundred years (1960), John Kennedy beat Richard Nixon by more than 100,000 votes. What makes recounts tempting under our current system is that a handful of votes in a swing state can deliver a grossly incommensurate number of electoral votes. In 2000 Gore won the nationwide popular vote by 540,000 votes; in a direct popular election there would have been no recount.
Karl Holbrook thinks I'm too hard on the Founders for not ending slavery and not making clear where sovereignty lay. Perhaps so, but my point was less to criticize them than to challenge the notion that they got nothing wrong. Having said that, I still wish the opponents of slavery had pushed harder at Philadelphia in 1787 to end not only the slave trade but slavery itself. Slave traders were given twenty years to recoup their investments; a similar sunset arrangement—perhaps lasting the life of persons then enslaved—might have been devised for the underlying institution. It would have been difficult, but far less difficult and costly than the method that finally settled the issue in the 1860s.
Don Ryan makes a good point that the qualities we admire in the Founders were elicited by the crisis through which they lived. It's worth remembering, though, that they brought most of the crisis upon themselves. There didn't have to be an American Revolution; Samuel Adams, James Otis, and the others might have paid their stamp taxes and sipped their East India Company tea, and America might have acquired independence slowly, the way Canada did. As a longtime admirer of our northern neighbors, I can imagine worse scenarios. One positive side effect might have been the peaceful elimination of slavery (accomplished in the British Empire in the 1830s).
As an "at-home" mom, I read with great interest and enjoyment Caitlin Flanagan's article "Housewife Confidential" (September Atlantic). My only problem with this article is that it perpetuates common misrepresentations of The Feminine Mystique.
I, like most of my generation, had heard an awful lot about this book and thought I knew its message without actually reading it. From reading this article, I suspect that Flanagan falls into this category as well. Her references to this book seem to include it as part of the "standard [feminist] cant," with an emphasis on career and disdain for housewifery that this article criticizes. In fact (as I now know, thanks to my book club) the bulk of the article actually echoes and supports much of what The Feminine Mystique says.
For instance, Flanagan writes,
The general idea ... is that shortly after President Truman dropped the big one on Nagasaki, an entire generation of brave, brilliant women ... was kidnapped by a bunch of rat-bastard men, deposited in Levittown, and told to mop. That women in large numbers were eagerly, joyfully complicit in this life plan, that women helped to create the plan, is rarely considered.
That may be true of what many zealous feminists, past and present, have reported, but their stimulus was probably not The Feminine Mystique, which devotes a full chapter to the subject and supports Flanagan's conclusion that after the war both men and women actively embraced the ideal represented by Ward and June Cleaver. It was the eventual failure of this so-called "ideal life" to fulfill women that led to the "problem without a name" that Betty Friedan documented.
Flanagan also presents us with a short description of the life of Erma Bombeck, as a counterargument to the narrative of oppression and boredom imposed by the women's movement (to paraphrase Flanagan). However, Bombeck's life as presented in this article seems to repeat precisely the pattern recognized and documented in The Feminine Mystique: the career paths available before the war; the postwar dream of marriage, family, and home life that did not include career; the subsequent boredom and dissatisfaction: "Finally ... those dreams [of husband, child, house in the suburbs] came true. And she began to go absolutely bonkers." The fact that Bombeck found fulfillment when she began to write again (and, interestingly, that her writing was limited to the subject of housewifery) is fully in keeping with the insights of The Feminine Mystique.
East Norwich, N.Y.
Caitlin Flanagan's otherwise warm and insightful piece on housewives and at-home moms is marred by her repeated flailing at feminist straw men (okay, straw women). The feminists and working mothers I know have a far more nuanced appreciation of their own housewifely mothers, their at-home peers, and even Erma Bombeck than Flanagan evinces in return. This is not a story that needs a villain.
Caitlin Flanagan's paean to Erma Bombeck is very misleading in one way: the at-home lifestyle Flanagan touts is one that she is able to lead without sacrificing as most at-home parents do.
By describing herself as an at-home parent, Flanagan pointedly positions herself in contrast to working parents. Note, however, that she writes magazine articles for pay, in addition to caring for her children. Some would argue that this does not fit the definition of at-home parent and that Flanagan is a parent who works from home. (Does this definition make me an at-home parent from five o'clock to eight o'clock each evening?) Even if you reject this argument, it's certainly true that she has taken on the mantle of at-home parent while avoiding loss of her career, her current personal income and potential future earnings, her Social Security set-asides, her outside intellectual stimulation, and her professional standing. For someone with a paycheck and a career of her own to regularly imply that housewifery is bliss itself is not just misleading; it's insulting. I wonder why Flanagan insists on claiming this at-home parent status. Could it be that she knows it gives her credibility? In reality she's taking credit for the sacrifices other people have made—and then wondering aloud why some at-home parents are so angry.
I have just finished reading Caitlin Flanagan's article on Erma Bombeck and other women who wrote on the theme of running a home and raising children during the 1950s, 1960s, and 1970s. I was excited to read the article, because it covers a subject dear to me (I am thirty-three years old but first read The I Hate to Cook Book as a small child, and re-read it still). I really enjoyed the piece—but I wonder: Why didn't Flanagan cite Shirley Jackson's real masterpiece, Life Among the Savages? Raising Demons is its sequel; Life Among the Savages was published in book form in 1953 (four years before Raising Demons and Please Don't Eat the Daisies), after appearing as articles in Charm, Collier's, Good Housekeeping, Harper's, Mademoiselle, Woman's Day, and Woman's Home Companion. To not mention that title is a real oversight, and I am confident that other Atlantic readers will be as pained as I am by this title's going unmentioned in this otherwise wonderful article.
New Haven, Conn.
Caitlin Flanagan replies:
I'm sure that Jennifer Vomvas and her book group read The Feminine Mystique closely, and that they found in it—as I did when I first read it, more than twenty years ago—a sympathetic account of how the World War II generation arrived at its decisions about family life. If she had read my essay as closely, she would have seen that it attributes the current prevailing attitudes about postwar housewives not to Betty Friedan but, rather, to certain contemporary writers and filmmakers. Further, Ms. Vomvas suggests that Erma Bombeck's life stands as an enduring tribute to the wisdom of The Feminine Mystique. I'm not so sure. In her book Friedan describes the great housewife writers as being "like Uncle Tom, or Amos and Andy" (as the book group may have noticed, The Feminine Mystique never rings a bell if a gong is handy; in it Friedan devotes a chapter to comparing middle-class American housewives to concentration-camp victims, an analogy so ghastly that she apologizes for it in her 2000 autobiography, Life So Far). In fact, Bombeck always humorously maintained that she owed her career to an unpleasant encounter with Friedan. In 1964 she and a group of her friends went to hear the famous author give a speech; to their dismay, Friedan harangued them and insulted their lives. Bombeck went on to read The Feminine Mystique, in which she found her favorite housewife writers similarly maligned. Just a few weeks later she began her incomparable career.
And the rest, as Allan Fisher might say, is herstory.
I think Jennifer Debner's comments are intended to make me feel bad—but they've had the opposite effect. Apparently, I—alone of all my sex—am having it all! Bless her for giving me this agreeable new perspective. Eva Geertz is quite right that Life Among the Savages was published before Raising Demons, although I'm not sure leaving it unmentioned was an "oversight."
Paul Davies's "E.T. and God" (September Atlantic) was thoughtful but completely wrong when arguing that the discovery of extraterrestrial life would undermine Christianity's conception of God's special relationship with man. Discovery of alien life would profoundly, perhaps brutally, reiterate Christian doctrines.
First, Christians learn in Genesis that God gave mankind stewardship over the earth, not the cosmos. Christians believe that man is the highest creation on earth, not that he is the highest creation. There are higher beings who sometimes have served as God's messengers, or angels, to man.
Second, Christianity recognizes Jesus as God, man, and savior. But this special relationship between God and man does not rule out the existence of life on other planets, nor would it require, as Davies suggests, the existence of "saviors" for other species of intelligent life. Christians understand that God gave Jesus to humanity because humanity was a fallen species, not a successful species.
For Christians and Jews, the natural, incarnate world is the first revelation of God's existence. This revelation is more persuasive when the universe is encountered as generously made, with laws that continue to confound and amaze us as they are discovered. Davies's skeptics describe a "God of the gaps," who is squeezed out by man's growing knowledge, but Christians worship a God who grants man a widening light of understanding—and a growing circle of darkness, wonder, and mystery.
The discovery of an intelligent alien race would be an opportunity for enlightenment and error. The first question would be whether this race had escaped the same fall from grace that mankind had experienced. Other questions would follow. What could mankind learn of creation and the Creator from the aliens, and what might man share in turn? As with every discovery, the aliens would also pose new opportunities for error and sin, but nothing would change Christian doctrine on man's relationship with God.
The discovery of extraterrestrials will undermine pride, not faith. Power, primacy, and pride were the temptations faced by Jesus Christ, and these are inherent, recurring temptations for mankind. History, however, is a long lesson that Christians worship a God "not of this world," whose "ways are not man's ways."
Christianity has coped with many events that dismayed its adherents, including the discoveries of evolution and orbital mechanics, the invention of the printing press, the postponement of the Second Coming, the fall of imperial Rome, and the Resurrection. It is the secular humanists, I submit, who will be most challenged and overawed by extraterrestrial life.
Gerald E. Nora
Vernon Hills, Ill.
Paul Davies fails utterly to fathom the cosmic inclusiveness of Christianity. For Christians, the central religious metaphor is a crucifixion. God is decisively "personified" for us in the narrative of a human being bleeding and dying on a cross for the sake of both friend and enemy. The greatest of Christian theologians have long recognized that the Crucifixion requires neither recurrence nor reinterpretation for its significance and consequence to be manifested in other cultures, other worlds, and other creatures we may encounter. A love that bleeds for us, the Bible affirms, bleeds for all creation and never ceases such self-oblation. As Pascal wrote, "Christ will be in agony until the end of the world."
The Crucifixion is thus not some mechanical salvific transaction. The Anglican Bishop John Robinson rightly maintained, "The New Testament does not affirm that in Christ our salvation 'becomes possible.' It affirms, rather, that in him what has always been possible now 'becomes manifest,' in the sense of being decisively presented in a human [life]."
Davies distorts Christianity by insisting, too, that by its lights God's saving love is limited to human beings: "Jesus Christ was humanity's savior and redeemer. He did not die for the dolphins or the gorillas, and certainly not for the proverbial little green men." This is simply misguided. "Incarnation is the manner and mode of all of God's work in His world," Cardinal Berulle wrote, adding, "The incarnation is the condition, the work and the mystery wherein God reigns and whereby he reigns in His creatures." (Note: "creatures," not "human beings.")
Likewise a Protestant theologian: "The whole is Incarnation ... The doctrine of the Incarnation must be set in the context of a world order which is a manifestation of God, in all the stages of its evolution."
Davies reveals, finally, a surprising ignorance that these comprehensive, cosmic themes are not simply the development of later theology but are often quite explicit in the New Testament. Saint Paul, in the Epistle to the Colossians, for example, affirmed that in and through God's self-offering love, "all things were created, in heaven and on earth, visible and invisible, whether thrones or dominions or principalities or authorities—all things were created through him and for him. He is before all things, and in him all things hold together." (Colossians 1:16-17)
"In Him all things (all creatures) cohere." Christianity may not always live up to this utterly inclusive theology, but if it is faithful to the biblical witness, it should expect to encounter the same loving, "bleeding" Creator deeply incarnate in alien cultures and extraterrestrial intelligence.
The Reverend Nils Blatz
The Church of the Redeemer
In his book Cosmos (1980), Carl Sagan says, "There are some hundred billion galaxies each with on the average a hundred billion stars, there are perhaps as many planets as stars ... it seems to me that the universe is brimming over with life." To Christians this is a bizarre idea. Ten million Crucifixions on ten million planets do not fit with the spirit of the New Testament. Paul Davies attempts to find a fit. He does admit that Buddhists and Hindus would not be threatened by the prospect of advanced aliens, owing to their pluralistic concept of God. But he gets his reasoning wrong. The religions of Buddha and Lao-Tze and the Jina are godless religions—any god is decisively rejected. No gods, no problems with aliens. Davies also implies that some sort of spirituality, some religious feeling, seems to be a part of human nature. On the contrary, it is the absence of natural belief that has led to the attempt to "prove" beliefs. And "proof" is what Davies attempts. He invokes astrobiology and a so-called revamped design argument and says, "The more one accepts the formation of life as a natural process ... the more ingenious and contrived (dare one say designed?) the universe appears to be." He forgets that a religious leader (I know not from what planet) warned, "One does not pour new wine into old skins."
Incline Village, Nev.
Paul Davies assumes that intelligent extraterrestrials (ETIs) would of necessity function morally as human beings. That is, the lives of ETIs would be characterized by deceit, anger, greed, and the like. But what if these beings never "sinned"? What if the whole concept of salvation were to them a foreign and totally unnecessary idea? What if the earth is populated with the only creatures in the universe actually requiring redemption? Imagine us inhabiting a fallen world on the wrong side of the cosmic tracks, and the rest of the universe considering it wisdom to leave us alone, effectively quarantining Mother Earth. This idea of humanity's sinfulness being both alarming and unique is the thesis of C. S. Lewis's remarkable trilogy, written half a century ago.
Another speculation of Davies's centers on the supposed limitations of Christianity with regard to evangelism of other sentient species. According to this view, Christianity is too parochial, too earth-centered, to even consider such a possibility. Basic Christian doctrine, in fact, teaches its followers that one nonhuman, wise, and powerful species shares the universe with us, and is devoid of a savior. This population is the angels, good and bad. The majority of the angels exist sinlessly and need no salvation. For those angels who "fell" there is no remedy, and they remain forever cursed by their Creator. Humanity's opportunity for redemption is especially dear when viewed from the angelic perspective.
Samuel V. Rowe
Fort Pierce, Fla.
Paul Davies says that the discovery of a more spiritually advanced species might prove problematic for the Christian doctrine of atonement, and as far as most creedal Christians are concerned, this may well be so. But at least for Latter-day Saints (Mormons) this needn't be a problem. Although it is not necessarily an official doctrine, many Mormons believe that other universes have existed, are probably in existence now, and may exist in the future. They believe that Jesus Christ's atonement was infinite over time and space, probably "covering" those other universes as well. Some debate the issue, and plenty speculate, but the question is at least asked. This may be because Latter-day Saints are not "Biblicist," or bound to believe in an inerrant Bible, as many conservative Protestants are. I also suspect that liberal Protestants and Catholics would accommodate in fairly short order as well to the discovery of such a species. The only area that might be problematic for Latter-day Saints is our belief that we are literally in God's image—we reject the amorphous God as a Hellenistic accretion of the centuries following the apostolic era—and are therefore of the same "species." As Davies points out, though, if the universe follows the anthropic principle, we may all be of a family, and a this-worldly biological term like "species" would obviously be anachronistic. Perhaps a term like "meta-species" would have to be coined for such a transcendent and all-encompassing relationship.
Marc A. Schindler
Spruce Grove, Alberta
Although Paul Davies reports that some non-Christian religions have long settled the matter of extraterrestrial life, at least one Christian denomination also has room for such a discovery—the fast-growing Church of Jesus Christ of Latter-day Saints (whose members are often referred to as Mormons). As a longtime member of that Church, I have been taught from my childhood not only that intelligent extraterrestrial life exists, but that God created those beings for the same purpose he created us—"to bring to pass the immortality and eternal life" of his children, as Mormon Scripture puts it. God's creations living on other planets also have souls to be saved. The notion that Jesus Christ's sacrifice could extend beyond this world (and, indeed, resurrect our planet's dolphins and gorillas in a zoophilic heaven) is not troubling to a faith that believes that his atonement is both infinite and eternal—far surpassing the limits of an earthbound humanity. Instead of being a theological challenge, the discovery of extraterrestrial life would likely support Mormons' esteem of the faith's nineteenth-century founder, Joseph Smith. This wouldn't be the first astronomical finding to uphold Smith's views. "Worlds without number have I created," Mormon Scripture records God as saying—a statement published generations before the current flood of discovered extra-solar planetary systems. Although astronomers contemptuous of religion would surely have no truck with the Mormons' nineteenth-century farmer-prophet, the irony is that their discoveries make both Smith's version of the universe and his vision of Christianity more relevant rather than less.
I enjoyed immensely the thought-provoking "E.T. and God." However, I take exception to Paul Davies's statement "It is more likely that any civilization that had surpassed us scientifically would have improved on our level of moral development, too." Although I would certainly share Davies's hope that would be the case, I see nothing in our modern world that suggests we have developed morally in proportion to our scientific development. In fact, I think an argument could be made that the opposite is true. Davies reveals one of the flaws in the scientific approach to theology when he speculates that "an advanced alien society would sooner or later find some way to genetically eliminate evil behavior, resulting in a race of saintly beings." What is so saintly about a life devoid of evil behavior in a world where there is no evil behavior? Surely Davies does not think that a world without evil behavior is reflective of the deepest human aspirations. I shudder at the thought of who is going to decide what evil behavior to genetically eliminate. I am afraid that Davies expresses too great a confidence in science and its role.
Paul Davies makes a common but crucial mistake in "E.T. and God" when he states that Christianity is concerned only about the salvation of human beings. Christianity teaches that Jesus Christ died for human beings to save them from sin, in order that they might participate in the redeemed new creation. In other words, God does not save human beings from creation; God is saving his creation, including sinful humanity. God loves the cosmos (John 3:16); in Jesus Christ, God reconciles the cosmos to himself (2 Corinthians 5:19); in Jesus Christ, God unites all things in heaven and on earth (Ephesians 1:10). All things in heaven and on earth would certainly include dolphins and gorillas and even the proverbial green men.
Davies concludes with an apt reference to Giordano Bruno. Bruno was infamously killed because he taught that scientific discovery would bring about the demise of Christianity and the revival of the cult of the Sun God Ra. Is there not an echo of this assertion in this article?
The Reverend David H. Miley
Rock Island, Ill.
Extraterrestrial intelligent beings: they would certainly create a greater theological controversy than current debates over gay bishops in the Episcopal Church or women and noncelibate priests in the Catholic Church. However, Christian churches would have—by thinking outside the box or the Bible—at least one powerful argument to support themselves in the face of such a discovery. Of course, the discovery of ETI beings would have to be proved beyond any doubt, preferably in person, before Church authorities would have to respond with this view.
At the time of Jesus's Crucifixion no one in the Roman Empire, I dare say, knew of the existence of human beings in China and the Far East, let alone in the undiscovered continents of the Americas, the islands of Polynesia, the continent of Australia, and numerous unvisited places in Africa. For nearly another fifteen centuries (which may have seemed as long a duration as light-years seem to us) many of these peoples would remain as unknown to Christians as extraterrestrial life is to us. Following the explorers' discovery of these lands and peoples, Christian missionaries provided the Gospels to the natives. Jesus did not have to be crucified and resurrected in person again for each of these peoples to provide their souls with the means toward salvation, any more than the billions of earthlings who have been born after the first generation of Christians have required personal witness to his teachings and Crucifixion. In fact, Christian theology has already had to deal with the problems of all the people who died before Jesus's appearance as well as all those people who have died since the preaching of the New Testament without ever having had an opportunity to even hear the Gospels. Basically, future Christians facing ETI beings can take the example of their predecessors, the Jews, and claim to be God's chosen people, and offer ETI beings the good news of salvation—assuming they don't already have their own messiah and revealed word of God. In the latter case another clash of faiths may emerge.
The discovery of intelligent life with foreign belief systems has already occurred many times here on earth. At least among cosmopolitan human beings, religions are learning from and adapting to one another. When Jews, Muslims, Christians, Buddhists, and the odd scientist break bread together and seek understanding (alas, too rarely), how would it be different to include E.T.?
Martin S. Ewing
One point that Paul Davies could have emphasized is the relative rarity (based on our current knowledge) of the conditions (and sequence of conditions) required for the formation and evolution of intelligent life and the development of a technological civilization. Consider, for example, how earthly evolution was affected by asteroid impacts. This in turn suggests that the likelihood of there being such a civilization close enough to us in space and time to make its presence known is exceedingly remote, SETI efforts to the contrary notwithstanding. Accordingly, if I were a Christian theologian, I would not be overly concerned about the possibility. For all practical purposes, we indeed may be alone in the universe.
Del Mar, Calif.
Paul Davies posits that the discovery of alien life—especially nonhuman sentients—might occasion a crisis of faith among the religiously minded. It happens that first-contact scenarios are a staple of science fiction, and an exhaustive collective effort has been made to examine almost every conceivable first-contact problem, not excepting spiritual ones. For example, in his essays C. S. Lewis considered many of the problems Davies raises and also penned novels around some of them (Out of the Silent Planet, 1938; Perelandra, 1943).
Aliens might be fallen or unfallen. If fallen, their fall could take a form different from man's, and so too with their redemption. Several stories deal with the spiritual aspects of first contact by sending priests along with explorers. Larry Niven and Jerry Pournelle's The Mote in God's Eye (1974) takes this approach. The exemplar of this literature is James Blish's A Case of Conscience (1958), in which the Jesuit member of a survey crew struggles with the question of whether there can be goodness without God, and what obligations are imposed on Christians confronted with evidence thereof. Roger Zelazny's A Rose for Ecclesiastes (1963) describes the influence that human religious thought might have on an alien race. Closer to home, Lewis's friend J.R.R. Tolkien, in The Lord of the Rings (1954) and The Silmarillion (1977), described earthbound sentients who are spiritually aware but not gifted with immortal souls. A fair amount of science fiction also deals with this problem in artificial intelligence. An early and amusing example is Anthony Boucher's The Quest for Saint Aquin (1951).
The fact that these scenarios have been considered doesn't mean that there would not be any crisis of the sort that Davies posits, but it does suggest that there would be a reserve of spiritually minded people—in addition to Davies and his readers—who have considered these issues and have a certain comfort level with them. In this I suspect that spiritual crises are much like other crises—some people find ways to accommodate the problem within their existing mental framework, whereas others abandon that framework. Moreover, since science-fiction writers do not shy away from any aspects of alien contact (the aliens, it is said, generally come to save you or to eat you), it is interesting that these stories often end optimistically from the point of view of the person of faith, suggesting a strain of religious thinkers who would be as excited by a first contact as their less adaptable co-religionists would be disturbed.
In response to Patrick J. Buchanan's remark that few Americans would call for Howard Dean in the event of "some terrorist horror on the scale of 9/11" ("Four More Years?," September Atlantic), I would like to point out that the majority of Americans didn't call for George W. Bush, emergency or not, and for many of us the "tough guys who win wars" have been nothing short of a disaster. The admiration for tough guys that Buchanan claims Americans possess reminds me of lines from Sylvia Plath: "Every woman adores a Fascist / The boot in the face ..." To me and many others, a truly tough guy wouldn't have responded as Bush and his colleagues have. Buchanan is ostensibly undertaking a dispassionate survey of the political landscape, but by reducing the vast problem of terrorism to the level of a schoolyard fight, he achieves the larger goal of marginalizing opponents of Bush's "war on terror." I think what Buchanan is saying is that he believes Americans prefer simplistic answers to complex problems. Some Americans may have this preference, but for those of us who don't (and whose voices on this subject are largely ignored), I would like to point out that in the event of an emergency, many Americans would like to call out, "Get me Howard Dean!"
San Francisco, Calif.
In "Four More Years?," Patrick Buchanan states that history (among other factors) makes the defeat of George W. Bush in 2004 appear "improbable." I think he ignores some other historical data, which—should history repeat itself—would show Al Gore defeating Bush by a resounding margin.
Bush is the third direct descendant of a former President to be elected—the first two having been John Quincy Adams and Benjamin Harrison. All three were elected despite losing the popular vote. Both Adams and Harrison were soundly beaten for re-election by the popular-vote winners they defeated.
Look at more similarities. The 1824 election was a four-way race, and Adams was second to Andrew Jackson, who did not have enough Electoral College votes. This sent the decision to the House of Representatives. Henry Clay, who finished third, threw his support to Adams, and was thereby appointed Secretary of State.
The 1888 election, between Harrison and Cleveland, was probably the most corrupt in history, with allegations of vote buying and more. Cleveland won the popular vote by a majority of 90,000, but Harrison prevailed in the Electoral College.
We know that a large majority of Gore supporters believe that the 2000 election was stolen. Gore has announced that he will not run in 2004—but here are even more precedents for his not only winning but becoming a two-term President. He will have dodged a problem by losing in 2000, because sitting Vice Presidents who are elected usually serve only one term.
Pat Buchanan replies:
Bill McClanahan, his mind clouded by his dislike of the President, both misstates the facts and misses my point.
First, the war on terror, in which George W. Bush is commander-in-chief, is an undeniable success. Al-Qaeda has been decimated and run out of Afghanistan; many of its leaders and soldiers have been arrested or killed; and not a single act of terror has been perpetrated on U.S. soil since 9/11. If that's not success, what is? If the President has blundered, it was in launching an unnecessary war in Iraq, which I opposed.
Second, McClanahan says I reduce "the vast problem of terrorism to the level of a schoolyard fight." False. I simply make the point, with which few analysts disagree, that a President perceived as tough will almost surely benefit in the face of a foreign attack late in an election. JFK's defiant stand in the Cuban missile crisis ended Republican hopes for gains that fall. In the first months of the Iranian hostage crisis the nation rallied behind Commander-in-Chief Jimmy Carter, enabling him to easily dispatch Senator Ted Kennedy, who had been running ahead of him. If Howard Dean is nominated, McClanahan had best pray that al-Qaeda does not perpetrate some horror in late October of 2004—or Howard is toast.
As for Phyllis Humphrey, her history is good up to a point. John Adams's son and William Henry Harrison's grandson did each serve but one term, but the latter was defeated by a hugely popular former President Grover Cleveland, and the former was defeated by the foremost hero of that day, who had not only routed the British at New Orleans but also stolen Florida from Spain. Sorry, Phyllis, but Al—whatever he did for the Internet—does not call to mind that earlier Tennessean Andrew Jackson.
As an anthropologist who teaches about the Middle East and Orientalism, I found Christopher Hitchens's review of Edward Said's Orientalism ("Where the Twain Should Have Met," September Atlantic) bizarre in its opacity. It is Hitchens who missed an opportunity to do what he wants Said to do—shed some insight on cultural-political interactions. But Said's task was of a different sort—to understand why such a dialogue has not been possible. Said observes the West observing the East: "The Orient is not only adjacent to Europe; it is also the place of Europe's greatest and richest and oldest colonies, the source of its civilizations and languages, its cultural contestant, and one of its oldest and most recurring images of the Other."
The Middle East and North Africa have been and are still colonized by Euro-Americans extracting natural resources, displacing peoples, erasing traditions and histories, causing destruction. The reverse has not been true. Said cannot "negotiate" these power disparities. He can only discover the intellectual traditions that foster an idea of the Orient as being other than the Occident—unchanging, mysterious, childlike, inferior. That negative images are found mirrored in both East and West is not relevant to the Orientalist construction of positional superiority, especially when it is backed in practice by superior technologies.
The trouble with Hitchens is that he is stuck in his own partial, confused understanding of present-day Iraq. His zeal leads him to make absurd statements, such as "Saddam Hussein was better able to force himself on my attention than I ever was to force myself on his," without seeming to understand that he can thank U.S. foreign policy for supporting Hussein's brutal regime. Zeal causes him to ignore facts about the alliance of exiled dissidents with covert CIA monies in the 1990s, and his self-confessed lack of Middle Eastern scholarship makes him vulnerable to a charge of glib personality polemics.
In his article about the supposed failure of Edward Said to mediate properly between East and West, Christopher Hitchens writes, "But for some reason—conceivably connected to his status as an exile—he cannot allow that direct Western engagement in the region is legitimate."
Is the exile status of Said the most obvious explanation here? What about the West's history of "engagement" in the region—one especially fine example being the U.S.-British installation of the Shah after overthrowing the democratically elected government in Iran to get control over Iranian oil? And yet from where Hitchens sits, the Said fiction that is American Orientalism "doesn't seem that restless [that is, restless enough to want an oil pipeline in Afghanistan] ... it asks only that the Afghans leave it alone."
Oh, the poor wronged innocent that is the West! I find it hard to understand how Hitchens can give the West, especially its incarnation as the present Bush Administration, such a free pass, when he gave Clinton and even Mother Teresa no quarter. He uncovered or constructed for us every conceivable deviousness of theirs. What went wrong, Mr. Hitchens, with your powers to make transparent?
When Christopher Hitchens asks what went wrong in Edward Said's failure to explain the East to the West, he seems to suggest that there is a valid alternative explanation to the obvious state of the Middle East (those mainly Arab and Muslim states around the Mediterranean), and that the West should have a hand in providing it. I'm afraid that if it looks like a duck, walks like a duck, and quacks like a duck, a duck it likely is. On all the evidence, even of its own institutions, intelligentsia, and media, the Middle East is primarily ignorant, cruel, unjust, and unproductive in the entirety of its civic, personal, and religious expression. For all his apparent residual, instinctual left-wing leaning to mythomania, Hitchens surely accepts that today's East is doomed to extinction. Is he too dangerously hopeful? Where in history has there ever been a recovery from the depths of failure in which the people of the Middle East find themselves today? Sadly but clearly they are falling under the weight of their own inadequacies, not ours, and the West should follow America's lead in not letting them drag us down with them.
Unfortunately, the anti-Israel and anti-American views of Edward Said have infected not only his students but also the university where he teaches. Christopher Hitchens correctly points out that Said could have been a bridge between the Arab world and Western civilization but instead has supplied a platform for dismissing Western values in favor of a radical form of pan-Arabism that threatens our relations with this region of the world.
His actions in attempting to stone Israeli soldiers protecting Israel's northern border with Lebanon were only the tip of the iceberg in what has degenerated from anti-Israel actions to overt anti-Semitism. Certainly he has been a credit neither to the teaching profession nor to Columbia University. His retention is one of the best arguments against tenure at our institutions of higher learning.
Silver Spring, Md.
I would like to commend Christopher Hitchens for his sensitive and insightful analysis of Edward Said's work. With regard to German colonialism, Hitchens is correct to point out Wilhelm II's visit to Damascus as an example of a German Drang nach Osten, but noting this has little relevance for Said's views of potentially Orientalist German literary production of the early nineteenth century, since both Goethe (1832) and Friedrich Schlegel (1829) died before there was an official German imperialist project (and, indeed, before there was an imperial Germany). Nevertheless, that Germans were eager to participate in colonial enterprises prior to the 1871 unification of Germany is ably documented by Susanne Zantop in her book Colonial Fantasies: Conquest, Family, and Nation in Precolonial Germany 1770-1870. None of this, however, absolves Said of his misrepresentation of Goethe's West-östlicher Divan—a work that Hitchens aptly characterizes as "one of the most meticulous and respectful considerations of the Orient we have"—as an Orientalist text. Whereas Said states that "the essence of Orientalism is the ineradicable distinction between Western superiority and Oriental inferiority," Goethe in the Divan repeatedly refuted this, such as when he wrote, echoing the Koran, "The Orient is God's! / The Occident is God's!," thereby equating the two parts of the world.
Said's misunderstanding of West-östlicher Divan begins with its title, which he repeatedly gives in Orientalism as Westöstlicher Diwan, a mistake that Hitchens unfortunately reproduces in his review. The proper title is West-östlicher Divan, "Divan" being a transliteration of the word based on Persian phonetics, whereas "Diwan" follows the sound of the same word in Arabic. The greatest influence on the Divan was the poetry of the great Persian poet Hafiz, which Goethe had read in German translation and which inspired him to compose his own Divan.
More significant is Said's omission of the hyphen in West-östlich, a hyphenation chosen by Goethe himself. In eliminating the hyphen, Said negates the possibility of living between two worlds, something he himself has done quite successfully. As Goethe wrote in lines not included in the Divan but possibly meant for it:
Also zwischen Ost- und Westen
Sich bewegen, sei zum Besten!
(Therefore between East and West
To move oneself, that is the best!)
Without entering into a debate on the suitability of Iraqi exiles in the governance of postwar Iraq, I want to share in Hitchens's "hope of cultural and political cross-pollination between the ... Middle East...and the citizens of the Occident." Culturally, the finest example of that cross-pollination may be Goethe's West-östlicher Divan. And just as the Eastern poet Hafiz inspired the Westerner Goethe, so, too, has the Divan inspired one of the Muslim world's greatest poets, Muhammad Iqbal, spiritual founder of the state of Pakistan, to compose his Persian Message of the East: An Answer to Goethe's 'Divan'.
St. Louis, Mo.
Christopher Hitchens replies:
I have a feeling that Laura Nader would have found my review "bizarre in its opacity" even if she did not write "as an anthropologist who teaches about the Middle East and Orientalism." And I have spent enough time teaching in Berkeley to be familiar with people whose academic world view explains less and less about what actually confronts them.
I do not think, to begin with, that Professor Said would thank her for saying that he claims no role as interpreter or "negotiator." There is an immense volume of work under his name that suggests the contrary. His most recent essay at the time I wrote, in Al Hayat (August 25, 2003), says, "As far as the Middle East is concerned, the discussion must include Arabs and Muslims and Israelis and Jews as equal participants. I urge everyone to join in and not leave the field of values, definitions, and cultures uncontested." If only one side in this argument had anything to learn, Ms. Nader might conceivably hope for a chair in the study of the region.
Her reverse-Orientalist dogma has no means of explaining the alliance of the Turkish empire with imperial Germany, any more than it can account for the current colonization by post-Ottoman Turkey of Christian and European Cyprus. Nor does she care to engage me when I point out the obvious—the demand by the supporters of Osama bin Laden (and significant others) that the old imperial caliphate be restored. When she states, then, that "the reverse has not been true," are we to understand that she forgets the historic Muslim invasion and colonization of the Balkans, or of Spain or Greece, or the frantic nostalgia for same? Or may we suppose that this is news to her? In that case it must seem doubly odd that American military power recently prevented the extermination of the Muslims of Bosnia and Kosovo, who are themselves the descendants of that famous "interaction."
The role of the United States in all this has been paradoxical and contradictory: it indulges Muslim Turkey in Cyprus, for example, just as it overindulges Messianic Jewish settlement in Palestine. Woodrow Wilson opposed the Anglo-French carve-up of the caliphate; Franklin Roosevelt and John Kennedy were friendly to the decolonization of North Africa; and Dwight Eisenhower and John Foster Dulles forced a defeat on the last European "punitive expedition" at Suez in 1956. (The most actively colonial of the current European powers, the France of Jacques Chirac, continues to intervene in Africa and the Middle East with great promiscuity while ranging itself solidly against regime change in Iraq.) Those who lack any ironic or dialectical ability may still wish to ask what Iran has done with its oil resources under the quarter-century rule of a stupid, aggressive, sterile theocracy that once dispatched state agents to murder an Indian novelist living on "Euro-American" soil.
I think that I can fairly claim to know that the United States government more than once demonstrated culpable laxity toward Saddam Hussein's system—a system that was something more than merely "brutal." But this point on Nader's part is made only to be discarded in bad faith—because her real animus is obviously against those who argued successfully that such a policy was wrong and ought to be reversed. If, to her, the two positions are equally "Orientalist," then I think it's plain that she has collapsed into tautology.
Nina Sakun obviously spared herself the reading of that part of my review which dealt with the Shah of Iran and the deserved ignominy of his fall. It would probably seem simplistic of me if I said that there was a considerable difference between an intervention to install a tyrant in Iran in 1953 and an intervention to remove one (and to prepare for an election) in Iraq in 2003. But that's how simple-minded I am prepared to be. I feel comparatively sophisticated, nonetheless, when contrasted with those who think that the polity of the United States or the United Kingdom consists of an unchangeable and single personality, incapable of evolution over six decades. Even so, I do not regard "the West" as a "poor wronged innocent." That's a description I would reserve for the 3,000 or so people of all nationalities and cultures who were murdered on one working day by the envoys of international Islamic zealotry (and for the innumerable and presumably "Eastern" Afghans, Pakistanis, Algerians, and Iranians, among others, who have been murdered or enslaved or repressed by Talibanism and its emulators).
Keith Hancock and Nelson Marans seem to suffer from different but related forms of sectarianism—too narrow and particular for this large and complex argument. Mr. Hancock forgets that the failed-state diagnosis he cites has recently applied with equal force to Orthodox Christian Serbia and Catholic Croatia, both of them just as proximate to the Mediterranean. But there were democratic and pluralistic forces buried under that ruin and now gradually re-emergent, just as there are in Iraq and Kurdistan and Turkey and Egypt and Algeria. Our task is to keep faith with these very elements, and not to see them vulgarly denounced as "Western puppets." This task is made infinitely more difficult by voices such as that of Nelson Marans, who seems to feel that dissent from his view of the Palestine question is sufficient warrant for dismissal from a tenured position at an American university. I might not have chosen Professor Said's method of celebrating Israel's overdue departure from its illegal occupation of Lebanese soil, but he had a perfect right to this contemptuous form of expression, and it did not affect in the least his responsibilities at Columbia, which he has always discharged with exemplary scruple and measure.
I say this in spite of the fact that, in the Al Hayat essay cited above, Professor Said accuses me of "racism at bottom" for my review, and with heavy sarcasm asserts that I glory in Anglo-American imperialism and wish to punish the "wooly-haired natives." I'll refer those who may still harbor this view to my Atlantic review on the subject of colonial partition (March 2003). And I'll say again that I think Said's work has been undergoing a qualitative degeneration from what was once a very exalted standard.
One of the pleasures of writing for this magazine is the arrival of letters such as that from Kamaal Haque, whose expertise I should have drawn upon before daring to set pen to paper on this subject.
William Langewiesche's article "Anarchy at Sea" (September Atlantic) certainly rang my bells. During the 1980s I served as a technical adviser at the International Maritime Organization for the International Transportworkers Federation, a labor organization with "observer" status that addresses various seamen's issues including safety. It seeks to be the "conscience" of the IMO. Here is what I learned:
First, shipowners, through their various nations, are in control of the agenda and results. Since most of the world's shipping is represented by owners in comparatively rich and technologically advanced countries, they vote their interests. Flag-of-convenience countries never deviate from the shipowners' program. Voting power is apportioned on the basis of "tonnage"—that is, how much ship weight is under a flag. Thus Third World countries and those with little tonnage have little say in either the process or the outcome.
Second, the IMO's work is primarily driven by profit, for both shipowners and their respective nations' interests. For example, many new rules specify a technological solution to problems that many countries can ill afford. Satellite-communications units are mandated for safety, but only a few countries manufacture them. Countries with struggling economies are hard pressed to raise their standards to meet rules and regulations demanding expensive technology. These countries can neither manufacture nor service the required equipment. For technologically advanced countries mandated equipment is an economic bonanza.
Third, the IMO reaches its conclusions through "consensus." It tends to take an idea or procedure that has the possibility of bearing fruit and whittle, modify, and expand the descriptive text until what was an apple appears to have become an over-ripe tomato. The product pleases everyone because it accomplishes nothing but appears to be progressive. It is a display case with no real merchandise.
Fourth, the IMO is a plum assignment, especially if you are from a disadvantaged country. It overlooks the Thames across from Big Ben, in London. Lunches are delicious and priced at about a third of their street value. Like other organizations in which appearances are inversely proportional to output, the IMO is also a welfare program for its staff.
Finally, the IMO neither regulates nor enforces. Although lots of paper is generated, most of the work dealing with it falls on overworked ships' crews, which are constantly being reduced in size. The only thing that the ships' masters and the companies want to hear is "Your papers are in order." Enforcement of rules and regulations depends entirely on which nation-state is doing the enforcing. It is spotty, with few consequences for noncompliance. Likewise, inspections are a joke. It may be worth remembering that the CEO of Exxon's marine division retired shortly after the Exxon Valdez disaster to take the helm of the American Bureau of Shipping. The ABS is represented in ports around the world and certifies seaworthiness.
After twenty years as a seafarer, ten of them spent representing seafarers, I find it difficult not to be cynical about maritime "regulation." Little will be done about ship safety, piracy, or opportunities for maritime terrorist activities in the foreseeable future.
Donald M. Dishinger
Panama City Beach, Fla.