"How Arafat Destroyed Palestine," the cover headline for David Samuels's article "In a Ruined Country" (September Atlantic), seemed wholly inconsistent with the text. Yes, we learn that Yasir Arafat diverted aid and business money to his private accounts and ran Palestine from "small black notebooks" carried in his vest pocket. But that is hardly unique for Third World potentates, and is particularly understandable in the chaos of modern Palestine—a fragmented, impoverished land of displaced persons. The funds were owned in theory by the Palestinian Authority, but Samuels tells us there really was no Palestinian Authority. Arafat, no organizer, could not assemble an instant government, but in his secondhand clothes and openhanded manner he was able to give hope to a poor and dispirited people. Now Arafat's sequestered $1 billion to $3 billion is being returned from these foreign banks, Samuels reports.
If the question is "Who destroyed Palestine?" a string of more obvious choices leap to mind—Begin, Netanyahu, Sharon, et al., whose rapacious settlement policies have dashed any hope for even a token Palestine.
In the matter of diverted funds, a Bill Moyers piece for PBS last year suggested a poignant similarity between the total cost of the Jewish settlements in Palestine and the hundreds of billions in U.S. aid to Israel. Maybe it is the American taxpayer who unknowingly destroyed Palestine.
It may be a sign of public ignorance, or perhaps failure of the media, that Yasir Arafat's leaving Palestine bereft of principles, administration, and finance is considered news. Certainly the situation was plain long before his unlamented demise. Bill Clinton and Ehud Barak gambled that Arafat was capable of ceasing to be a thug, but alas for his constituency, Abba Eban's witticism that the Palestinian leader never missed an opportunity to miss an opportunity proved true to the very end. The Palestinians needed a Lincoln, wished perversely for a conqueror, and misplaced their faith in a vicious and self-interested Third World kleptocrat.
Yet the thesis that Arafat destroyed Palestine is questionable. Although it is unfashionable to say so, Palestinian society is backward. Why else would Arafat be lionized? It is a society that revels in victimhood, glorifies death and terror, and has been content to remain a backwater and dependency, eschewing reasonable opportunities to seek peace or become productive. The lack of countervailing pressure to construct a genuine civil society during Arafat's undemocratic reign, and the failure of such a society to emerge subsequently, is not a symptom: it is the disease. Arafat is as much a product of his culture as its author. The pessimists are justified.
Andrew J. Green
Overland Park, Kans.
David Samuels's article unfortunately creates an inaccurate impression about key facts regarding Lombard Odier Darier Hentsch & Cie, a private bank based in Geneva, Switzerland. For example, Samuels states that the Palestinian Authority's commitment not to use its funds placed in LODH "for any war or aggression oriented activities" might "have given a more-cautious banker pause." It was LODH, recognizing the sensitivity of the situation, that insisted on this commitment; the bank took on the PA as a client based in part on the positive environment following the Oslo Accords and the deep involvement of two former senior Israeli intelligence officers as financial advisers to the PA. Likewise, contrary to Samuels's statement that LODH "agreed to set up the account on the spot," the process took months. Finally, the article inaccurately states that financial controls over the account were removed in June of 2000.
LODH would like to stress that at all times, from its initial due diligence until the closing of the subject account, it met or exceeded its legal and professional responsibilities. Given the politically sensitive nature of the engagement, LODH took special care in vetting, maintaining, and ultimately terminating the relationship. LODH made tough decisions both to open the relationship in the optimistic period that followed the conclusion of the Oslo agreements and to terminate it during the second intifada. Although the current state of affairs in the Middle East is extremely unfortunate, any suggestion that LODH acted without due diligence is false.
If Samuels had contacted us before finishing his article, it would have afforded us the opportunity to correct these inaccurate statements.
Jérôme Koechlin
Head of Corporate Communications
Lombard Odier Darier Hentsch & Cie
Will David Samuels accept a small correction? Munib al-Masri—now the richest man in Palestine—was employed by Phillips in Algiers and also in Beirut, but he was not "the head of Phillips Petroleum operations in Algeria." My father, who spent his entire career with Phillips and was head of its Egypt operations from 1963 until 1970, says that al-Masri, who would have been twenty-nine in 1963, would have reported to Silvio Èha, who was the man in charge. Whether the promotion was Samuels's error or an exaggeration of al-Masri's, a fact's a fact.
Cynthia Beck Croasdaile
Greenwood Village, Colo.
In his fascinating portrayal of the late Palestinian leader, David Samuels indirectly presents an important question. If it is true, as the "official Palestinian Authority committee" concluded, that 43 percent of the "state budget" was embezzled and only 9.5 percent of it was actually spent on the needs of the Palestinian people, were U.S. and European officials aware of this situation? If so, what did they do in response? If not, why not? Should they be held accountable in some manner?
Ethan S. Burger
Washington College of Law
David Samuels hit the nail on the head. As he notes, Arafat's successor, Mahmoud Abbas, recently built a four-story mansion on land intended to be a public park. Clearly, Arafat's legacy of corruption and dishonesty continues.
Abbas's official television stations continue to broadcast extreme incitement, even as Abbas tells the United States that it has stopped. Many of the recent would-be suicide bombers (whose capture goes unreported in the United States) belong not to Hamas but to Fatah, Abbas's own movement. Abbas could easily have pursued the small and violent group Islamic Jihad, but he refused to confront it—let alone the larger Hamas.
On political issues Abbas has stuck to Arafat's hard-line positions. He demands that descendants of Arab refugees be resettled in pre-1967 Israel, not the West Bank (Israel suggested the latter). He rejects the rights of the millions of Mizrahi Jewish Israelis, who are descendants of a similar number of Jewish refugees from Arab lands. In demanding that pre-1967 Israel absorb both Jewish refugees from Arab lands and Arab refugees caused by the Arab wars against Israel, he clearly rejects a two-state solution.
It will take decades to undo the legacy of indoctrination started by Arafat. The steps taken by Abbas so far are not encouraging.
David Samuels replies:
The interviews I conducted in Israel and Palestine, and the extensive documentation I have seen, all suggest that Lombard Odier embraced the proposal to accept Arafat's money during an initial meeting with Ozrad Lev in Geneva in early 1997. The "due diligence" that followed that meeting seems to have consisted of a few meetings in Switzerland during which the structure of the account was hashed out and a subsequent trip by the banker Richard de Tscharner to Ramallah (where he was wined and dined by Rachid and was given a model of the al-Aqsa mosque as a gift by Arafat). In a formal letter to Lombard Odier dated June 2000, Rachid instructed the bank to remove the financial controls that Lev had insisted on—including the stipulation that funds from the account could be withdrawn only to a heavily monitored Palestinian Authority account in Ramallah.
It is easy to understand why, after four years of violence and thousands dead, the Swiss bankers who handled Arafat's money are so eager to assert that the bank "met or exceeded its legal and professional responsibilities." I doubt that the people of Palestine, whose money wound up in Switzerland, or the people of Israel, who suffered the effects of the brutal terror war that Arafat helped finance and direct, would approve of Lombard Odier's code of professional ethics. Obtaining a letter from a client promising not to engage in "war or aggression oriented activities" would qualify as a reasonable precaution against such activities only in fairyland. Surely a reasonable person might be led to wonder why a bank would do business with a client from whom such a letter was judged necessary. The suggestion that the bank's decision to accept Arafat's money was a noble deed undertaken in the optimistic spirit of the Oslo Accords beggars belief.
Mr. al-Masri is a gracious man who did his best to communicate with me for four hours in a mixture of English and Arabic. Any errors that crept into my shorthand version of his résumé are my responsibility.
By virtue of his very own "New Hampshire advantage"—high name recognition and favorability ratings in the Granite State's southern half—Mitt Romney ("The Holy Cow! Candidate," September Atlantic) is a more likely Republican nominee for president than most Massachusetts Democrats would have you believe. But even those who recognize his potential understand that his candidacy is likely to be derailed by two concerns, neither of which Sridhar Pappu addresses in his largely flattering profile.
The first is Romney's "evolving" position on abortion. As Pappu notes, Romney—having learned from his father's experience—has taken to being "very, very careful" in his choice of words. Yet in 1994, while campaigning against Ted Kennedy, Romney argued that abortion should be "safe and legal in this country." Though Pappu gives Romney a pass on his past statement, conservative activists are unlikely to do so.
The second is Romney's lack of foreign-policy experience. Simply put, save working with the Utah Public Safety Command during the 2002 Olympics, Romney has none. After 9/11 Republican primary voters may not be willing to elect a commander in chief with less practice in international affairs than Howard Dean.
I read with amusement that Mitt Romney, asked how he would prevent the death penalty from being imposed on an innocent person, replied, "By having a higher standard of care—something beyond 'reasonable doubt.' It should be clear and convincing evidence." I note that he went to Harvard Law School in the early 1970s, so perhaps he can be forgiven for misquoting the law he learned so long ago.
As any second-year law student can tell you, the standard of proof in a typical civil case is "preponderance" (51 percent) of the evidence. In a more complicated civil case it is "clear and convincing" evidence—think of the Exxon Valdez disaster. The standard of proof for guilt in a criminal case is significantly higher: "beyond a reasonable doubt." In other words, if the evidence in a criminal trial were no greater than "clear and convincing," a jury would be directed to return a "not guilty" verdict.
Romney bent over backward to point out that he learned from his father the importance of being careful what you say. If his standard of "clear and convincing" evidence were actually the rule in death-penalty cases, we'd be executing nearly every person convicted of murder. I don't think his dad would have approved.
Long Beach, Calif.
Sridhar Pappu says that Governor Romney makes policy decisions based on data, but he fails to note that Romney does so selectively. In the most recent Massachusetts budget sessions Romney eliminated funds for the Department of Education's Safe Schools program, which involves anti-suicide and anti-violence initiatives to protect gay, lesbian, bisexual, and transgendered teens. His budget vetoes do not reflect scientific data: although teenage suicide rates have decreased nationally, the scientific community has established that GLBT youth remain at greater risk for both suicide and violence. Perhaps Romney wishes to rein in spending (this line item amounted to 0.0014 percent of the approximately $22 billion budget), but this veto suggests that he either uses or ignores data to suit his political agenda.
Romney also eliminated language from the DOE's budget—language that had been included for years—that codified recommendations from the state's board of education on the safety of GLBT students. This particular political maneuver, one of many meanspirited actions by the governor, stands in sharp contrast to Romney's sincere use of words like "gosh" and "neat."
Tony L. Baptista
In the final section of his article on Mitt Romney, Sridhar Pappu asked the governor about his LDS temple garments and characterized himself as "uncomfortably" posing the question.
Of course he was uncomfortable. That's what happens when you ask something grossly inappropriate and then publish it in a national forum.
I was almost enjoying Bernard-Henri Lévy's rather meanspirited observations on American life ("In the Footsteps of Tocqueville," July/August Atlantic) until I came to his discourse on "creationism." As a former anthropology teacher, I've been following the debate over the teaching of evolution for some time. Despite his erudition, Lévy goes seriously off course when he smears the biologist Jonathan Wells as part of a Moonie conspiracy against evolution. And he wrongly lumps together all critics of Darwin as religious crackpots who use pseudo-science to recalculate the age of the earth and deny the fossil record.
Rather than engage in religion-baiting, I suggest that Lévy actually read Dr. Wells's book, The Icons of Evolution (2000), which takes a revealing look at the strengths and weaknesses in the Darwinian view of evolution—and reveals how the weak points in the theory are routinely ignored by many textbooks. Copiously documented with citations to the primary scientific literature, Wells's book has been vindicated on many of its points, and grudging corrections have had to be made to a number of textbooks.
The neo-Darwinist version of evolution (random genetic mutation and natural selection) is currently accepted by most biologists, but it is not a scientific slam-dunk. Unfortunately, it is defended by many scientists with a sanctimonious dogmatism worthy of the medieval clerics who persecuted Galileo. And the scientifically uninformed predictably frame the debate as one of bubble-headed Bible-belt fundamentalists versus the educated, the rational, and the sane. In fact, the unsolved mysteries surrounding the origin and evolution of life should intrigue and humble us all—including Lévy.
I began Lévy's Tocqueville odyssey with much interest and hope. My interest was in how a European would interpret and present today's American culture; my hope was that he could do it without reflecting too many of his own biases.
Those biases, I'm afraid, are making these otherwise well-written accounts increasingly hard to read. Nowhere is Lévy more difficult to read than when he takes on the interpretation of evangelical Christian thought and values. From his condescending account of his Willow Creek church experience (May Atlantic) to his editorializing on creationism, Lévy is incredibly subjective and even small-minded. His notion that those who believe God may have played a part, along with evolution, in producing the world we see around us are part of "the subtlest, most underhanded, most cunning, and at bottom most dangerous ideological maneuver … in years" would be humorous if he were not serious. Does he truly believe that some kind of conspiracy is involved here?
What finally tipped my exasperation over the edge was his editorializing on what he describes as American society's "profound evolution in the direction of the extreme right, definitively turning its back on its European heritage."
Where has Lévy been getting his information? Does he know that the members of this so-called extreme right, by which he must mean evangelical Christians, are the very ones who man and fund many of the large humanitarian efforts in this world? A brief reading of history will also tell him that evangelicals were (and still are today) among those who fought the hardest against slavery, religious intolerance, and the abuse of women's rights worldwide.
Lévy should be informed enough to know that there are millions of evangelical Christians around the world, most of whom have the same values and beliefs as those in America. To classify all of them as somehow "extremist" is, to say the least, sadly short of open-minded investigation and interpretation.
As an incarcerated reader conversant with Bernard-Henri Lévy (I read his Barbarism With a Human Face during my first term, in the late 1970s, and saw the feature on him in Vanity Fair a couple of years ago), I'm writing to dispute his points about the differences between private and public prisons (July/August Atlantic). Many of the observations he made about what happened to be a corporate-owned prison could have applied to any state-owned prison as well.
What inspired my writing most of all was the irony that unions are much more powerful in France than they are in the United States; but the growth of the "prison-industrial complex" has allowed guards' unions to become powerful enough to control policy. Certainly in California, which has the largest individual state prison system in the United States, the union has caused amenities to be added and then removed as suits it. It also spearheaded the "rehabilitation doesn't work; it is all about the victims" mentality, which created a "three strikes" law that was almost defeated last year—until Arnold Schwarzenegger, who claimed he would stand up to the guards, reneged. In quieter fashion Schwarzenegger has already stopped paroling long-overdue lifers and resumed reincarcerating parole violators for minor technical violations.
My point being: the standout thing about America's current penal system is that it provides a false economic bubble of jobs, and that the politics of being "tough on crime" have now led to the guards' having so much power that policy change seems to be impossible even after the media shine the spotlight on all the waste of money and lives. I still appreciate Lévy's writing about prisons as if they mattered, but to your average American—even your average Atlantic reader—I'd have to say they demonstrably don't.
Shane Williams
Metropolitan Detention Center
Los Angeles, Calif.
In his review of Animal Rights and Animals in Translation ("If Pigs Could Swim," September Atlantic), B. R. Myers comments on baiting bears in Maine, a practice he describes as "shooting them at close range, frequently in the back." While that's certainly true, and is best described as ambushing rather than hunting, baiting has some other dire consequences as well.
Maine's bear-feeding program, sanctioned by the state's Department of Inland Fisheries and Wildlife, is designed to produce more bears for out-of-state hunters, who killed 70 percent of bears taken in 2004—86 percent of them over bait. The task is made a bit easier for them because the state allows bait—stale jelly doughnuts and used cooking grease, among other delectables—to be placed thirty days before the beginning of the season, to lull the animals into a false sense of security. The result reduces a sport to a shooting gallery. This approach to game management also increases the possibility of disease transmission among bears and other animals that feed at bait sites; and it increases reproductive rates and "trains" thousands of bears to rely on human food.
So why does Maine—which also retains the dubious distinction of being the only state to allow bear trapping—condone this practice? It's not science or sportsmanship; it's economics. In addition to the revenue generated by hefty nonresident hunting licenses, the department's 2004 management report points out that out-of-state bear hunters frequently employ guides for large fees.
Ironically, the report also states, "Never feed bears under any circumstances." Maine's wildlife agency should follow its own advice.
Don Lopriено Wildlife Alliance of Maine
B. R. Myers trots out the same old line one often hears from those who do not actually know any hunters: that they are just in it for the kill. Although his characterization may be true for some, I can assure him that the vast majority of the 25 million Americans who enjoy hunting do so for many different reasons, the thrill of killing not usually among them. Myers asserts that bears are often shot in the back. While I'm not sure I understand why that is objectionable, no sound hunter would shoot a bear in the back unless there was no other choice. A bear, like most other animals, is most effectively and humanely killed by a shot to the vital organs. A bear's thick hide, large spine, and larger rib cage make the back a very poor choice.
Robert Hancock
B. R. Myers's blanket condemnation of hunters is simplistic. Some hunters take pleasure in killing, but for every anecdote Myers recounts about the "thrill of the kill," I can find a thoughtful hunter for whom hunting is an expression of the unavoidable way in which our lives are lived at the expense of wild animals. This intertwining of life and death is just as true for vegans as it is for hunters.
Although Myers is right that animals "are treated worse here [than in Europe] because we hardly think of them at all," this does not go far enough. The increasing cruelty of our industrial animal-production factories is the result of relentless commodification: animals have been turned into "production units." Part of our refusal to deal with the cruelty of this progressive and unending process comes from the assumption that markets produce rational results, with an almost invisible semantic slide from what is economically rational to what is moral. Nothing about this assumption is conservative. It is corporatist and consumerist, two sides of a pact with the devil.
J. Claude Evans
Associate Professor of Philosophy and Environmental Studies
Washington University
St. Louis, Mo.
B. R. Myers's ignorance of hunting is absolutely breathtaking, as is his statement "It is obvious that the real attraction of these 'sports' is the thrill of the kill." I grew up in an upper-middle-class family in which no one hunted; I didn't go redneck until my early thirties. Now I am a passionate bow hunter for white-tailed deer, which means I see at least twenty deer for every one I could have any possibility of shooting. I have hunted in Adirondack deer camps with men who spend all year waiting for the season. When it finally arrives, they leave the easy deer hunting on the farms all around them in the lowlands and journey up to a cabin in the High Peaks region, where the deer density is only a tenth as great. They are looking for the most difficult deer they can find: old, skittish, spooky mountain bucks that you would never see in daylight at all were it not for the rut causing them—possibly—to let down their guard the tiniest bit as they search for does. Deer hunters like me will walk fourteen miles a day over rough terrain and count themselves lucky to get one shot every year or two. Don't tell me they're in it for the thrill of the kill. It goes deeper than that, all the way to the bone. Sure, some hunters are slobs. So are some journalists.
Bill Heavey
In his concluding paragraph B. R. Myers chides, "So it is that we now lag behind even the Spanish in animal welfare; and when the Turks get into the EU, we will lag behind a Muslim nation as well." The implication, of course, is that Muslims are morally and ethically inferior to Americans in general; that the United States should be ashamed not to meet the animal-protection standards in countries such as Turkey, Jordan, and Indonesia. In fact, the Islamic tradition sets very strict standards both for the care of animals and for their slaughter. This tradition calls for calming the animal before slaughter and shielding other livestock from the sights and sounds. Moreover, throughout the Muslim world factory farms have begun to take hold of the livestock industry only recently—owing to initiatives by USAID and other international aid agencies that serve the interests of corporate partners. Within the United States itself, Muslim Americans are organizing with interfaith groups to provide alternatives to factory farms. In doing so they hope to support the rights not only of animals but also of local farmers and farm workers who have become the prey of giant corporations. One can only hope that Myers might consider Muslims' own anguish over the greed and corruption rampant in the food industry, rather than ignorantly assuming their ethical backwardness.
Ian Straughn
Interim President, TAQWA EcoFoods NFP
Chicago, Ill.
It is refreshing to see an article that speaks so honestly about the plight of animals in this country and shows appreciation for the fact that all animals, both human and non-human, are to be respected for our different strengths, weaknesses, and forms of intelligence. Thank you for choosing to publish it.
Grand Rapids, Mich.
B. R. Myers replies:
Robert Hancock and Bill Heavey both claim that people hunt for reasons other than bloodlust, but neither can name any; Heavey's letter simply describes the lengths to which men will go for the thrill of killing a particularly "difficult" animal. Hancock explains that the "sound" hunter would not shoot a bear in the back because doing so is inhumane and ineffective. Since Hancock does not deny that bears are often shot in the back anyway—nor does he even understand why the cruelty is objectionable—his remarks only show how far the "sound" hunter is from the rule.
As for J. Claude Evans's letter, the notion that people kill wildlife to make a philosophical statement about the "intertwining" of life and death will probably elicit the same amused reaction from Hancock and Heavey that it did from me. Ian Straughn somehow contrived to miss the whole point of my article, which is that we Americans are not as humane as those foreigners whom we consider particularly cruel to animals. My remark about Turkey tied in with a mocking reference earlier in the piece to press reports of GIs' rescuing dogs from "mean Islamic streets," reports that implied—wrongly, as I made clear—that animals are better off in America.
But I think all these letter writers will agree that the American farm animal in its walled-in hell would be happy to change places with a wild bear, even one unfortunate enough to live in Maine.
In her review of Unraveled ("The Great Escape," September Atlantic), Sandra Tsing Loh wonders "how much insight the former Interview editor Mark Matousek can claim to have into mothers everywhere, as he is famously gay."
Not only does her statement do a disservice to gay parents, but it is also sexist. Why assume that gender, sexual orientation, or status as a procreator precludes insight into parenting? After all, what mature adult doesn't know and appreciate the sacrifices his or her parents made? Gay or straight, what manipulative daddy's girl or damaged mama's boy doesn't have a handle on what makes that parent tick?
The term "famously gay" reinforces a truism in our society: the only thing worse than being homosexual is to be gay and not be ashamed.
Though I typically enjoy Sandra Tsing Loh's witty and refreshingly self-deprecating pieces on the trials and tribulations of upper-middle-class contemporary motherhood, I was disappointed with "The Great Escape." In particular, I take issue with the portrayal of Tolstoy's Anna Karenina in the cheeky discussion of what Loh terms absentee motherhood.
While Loh clearly grasps the fatuousness of the actress Megan Mullally's remarks on Anna's inability to procure Paxil, she falls far short of providing an intelligent and plausible reading of the text and the reasons for its widespread and lasting appeal. The Oprah minions have ironically adopted the novel and its protagonist as an argument for what Tolstoy—an outspoken proponent of marriage, who argued that a woman's unique and foremost obligation was to her children—would surely have deemed morally scurrilous. This reading of Tolstoy—myopic and entirely untenable when considered in light of the author's canon, biography, and known views, not to mention generally accepted criticism of his works—should be condemned among educated, literate women as the antithesis of all things empowering. If liberation necessitates as specious justification such a dilettantish, herdlike misreading of a text renowned for its painfully eloquent argument in favor of moral and religious self-discipline, perhaps we are doing something wrong.
In three years spent working toward a degree in Russian language and literature I have been consistently baffled as to why so many female readers (myself included) are enamored of a writer overtly on the unpopular side of the "woman question" (see his foreword to Chekhov's short story "The Darling" for a quick bit of proof). I can only suppose that we recognize in Tolstoy an uncanny command of female sensibilities and the conflicts women endure, and simply choose to disregard his often unflattering conclusions as to the roads we take.
To hear Sandra Tsing Loh tell it, abandoning your children is the next great leap of feminism. Actually, women who leave their families merely lower themselves to the level of absent fathers. My ex-wife left our three kids for some guy she met on the Internet three years ago. She quit her job, took her retirement money, left the state, and pays $150 a month in child support. I fail to see what Oprah's Book Club, Anna Karenina, Shirley Valentine, or starched or stuffed shirts have to do with running out on your kids. This ultimate act of narcissistic self-actualization and cowardice does nothing for men or women. It just punishes the children, who did nothing wrong but are left to wonder if they did.
Dave Hippo
Sandra Tsing Loh replies:
In suggesting that Mark Matousek is not the most natural expert on what mothering feels like, I didn't intend to tread on the rights of any persons, including Matousek, gay parents, or men in general. I merely meant that the most natural experts are mothers themselves.
Now to the Russian language and literature problem, which sadly we do not have a Tolstoy novel's length of space to discuss. Short version: Oprah had always wanted to tackle Anna, feared length. Oprah's eventual conclusion: Long book, big tragedy, sad woman. Anna Karenina—juicy read, hard to see as story of female liberation, because of train. Are we actually in disagreement? Oprah's personal triumph was that she finished the book. Manilow's was his making a new twist on "Copacabana." Not a pretty culture, perhaps, but it's ours.
Finally, to Dave Hippo, I am truly sorry. What a tragedy. That is perhaps one thing on which an unashamed gay man, a baffled Tolstoy scholar, and I could all agree.
I take issue with the suggestions embedded in the articles by Stuart Taylor Jr. ("Remote Control," September Atlantic) and Benjamin Wittes ("Without Precedent," September Atlantic) that term limits be considered for members of the Supreme Court. Alexander Hamilton would certainly object to a rotating bench, if the goal is to remedy the lack of experience on the Court. What's more, giving greater control to the president and the Senate would only work to undermine judicial independence.
A political problem of this nature requires deference to the Constitution, not amendments. And if ambiguity in the law is the problem, perhaps we ought to look to legislatures, which seem to eschew concrete lawmaking whenever possible.
The influence of the Court has expanded, to be sure, but its growth has hardly matched that of the other branches. The greatest problem with the current Court is its inability to do its job, which is protecting the rights of citizens.
Steven Michels
Assistant Professor of Political Science
Sacred Heart University
As a law teacher for twenty-seven years and an occasional appellate litigator, I share Benjamin Wittes's view that it is difficult to teach the holdings in particular cases when the courts so often fail to honestly and objectively describe the facts and apply precedent. According to Justice Antonin Scalia, for example, Justice Thomas does not even believe in stare decisis, or the obligation to follow precedent. Bush v. Gore, perhaps the most prominent of the Supreme Court's recent opinions, was judicial activism in its purest form.
But the real problem is not the Supreme Court, where it is unlikely that a misleading description of law or facts will escape scrutiny from a dissenting justice. The real problem is in the federal appellate courts, where judicial appointees have been ideologically litmus-tested by conservatives and in some cases appear to regard their roles as lieutenants in an ongoing political and cultural war.
Wittes recommends that judges on the lower federal courts publicly criticize Supreme Court holdings. Just what we need—increased activism, division, and judicial anarchy. I can't defend the current Court with great enthusiasm—and Stuart Taylor Jr.'s preceding article on the relative experiential cluelessness of the justices seems right on the mark—but the subheading accusing the Court of "arrogance, dishonesty, grandiosity, and a lack of respect for principle, history, or logic" is plainly more applicable to the current leaders of the executive and legislative branches.
James F. Ponsoldt
Professor of Law
University of Georgia
Stuart Taylor Jr. is incorrect that only one justice, David Souter, has presided over a trial. The late chief justice William Rehnquist asked to be assigned to a trial, and presided over it, a couple of years ago—expressly because he wanted to learn more about trial practice.
Dilan Esper
Los Angeles, Calif.
In "Without Precedent," Benjamin Wittes makes a number of valid points about the usurpations that are routinely passed off as legitimate decisions by the U.S. Supreme Court. However, the idea that all members of the Court are guilty of sloppy, result-driven jurisprudence is wrong. As practitioners of the doctrine of original understanding, Justices Antonin Scalia and Clarence Thomas vigorously dissent from the make-it-up-as-you-go-along decisions that display the problems Wittes ably identifies. For example, Scalia's dissent in the 1992 abortion case Planned Parenthood v. Casey is a brilliant critique of this judicial arrogance.
Moreover, Judge Richard Posner's criticism that none of the current justices is "a John Marshall, Oliver Wendell Holmes, Louis Brandeis, or Robert Jackson in depth of insight or … breadth of experience" fails to recognize that Scalia surpasses the overrated Holmes and the very overrated Brandeis in commitment to principle, analytical rigor, and rhetorical skill.
The Court is a corrupting force in our government, but not all the justices contribute to this arrangement.
Gregory J. Sullivan
Stuart Taylor Jr. replies:
Dilan Esper is correct, except it was in 1984, before Justice Rehnquist became chief justice, that he slipped quietly out of Washington to preside over a two-day civil-rights trial in Richmond, Virginia. At his own initiative, Rehnquist sat specially as a U.S. District Court judge, a common practice in the early days of the Republic that had become virtually unheard-of in the twentieth century. I also erred in overlooking the fact that Rehnquist and Justice Anthony Kennedy did some criminal-defense work when they were in private practice, more than thirty years ago. I regret these errors.
While I admire and applaud Lori Gottlieb and other members of Single Mothers by Choice ("The XY Files," September Atlantic), I had to smile when I read about the hand-wringing that accompanied their decisions about which sperm donors should father their children. After I became pregnant (the old-fashioned way) with our first child, and my husband and I began to examine in earnest the gene pool on both sides of our family, we looked at each other and said, "Yikes!" Today we have two wonderful, healthy children, and although they are like my husband and me in some ways, in most ways they are like no one but themselves. Sometimes we can see where a characteristic came from ("Just like Aunt Doris!"); at other times its source is a mystery. Whenever one of our children displays a trait that baffles us, my husband mutters to me, "Not exactly what I ordered." I have to believe the same will be true for Gottlieb and the other members of SMC, no matter how carefully they select their children's fathers.
Lori Gottlieb claims that she and her like-minded friends were "idealistic." I think "goal-oriented" is the more honest description of women who hope to find a laundry list of items in "half a cubic centimeter of defrosted sperm." All through the article one finds rationalizations for what is essentially a selfish, shortsighted act.
I, too, am a single mother, although I confess to having taken a more starry-eyed approach to conception: I knew the father for some time beforehand. I freely admit that all purposeful childbearing is at least partially a selfish thing; no parent of either sex can pretend that there isn't some ego involved. But think of the already considerable risks and sorrows that inevitably affect our children at least as much as they affect us. To add to them by choosing a gene partner in this way is just plain irresponsible.
I was fascinated by Lori Gottlieb's story: by the practicality of her reasoning (buy discount toilet paper but not discount sperm), by the effort to make something romantic out of a medical procedure, and by her rosy vision of a future when, responsible for a child whose father she knows only as a number, she will meet—and marry—her soul mate.
And then I got angry, for surely we have lost our way.
Of course it's good to have options. Not so long ago women, even in the Western world, were severely restricted in their actions and activities, vilified if they had a child "out of wedlock," often rushed into marriage with an unwilling partner. Marriage was for life, divorce being rare, difficult, and expensive. A "divorcée" was not accepted in polite circles.
But too much choice presents us with a paradox. The more choices we have, the less satisfied we become. Nowadays if our marriage doesn't turn out exactly as we'd hoped, we can divorce and try again with someone else—perhaps someone who can "fathom [our] soul." And if we don't want to be tied to just one partner, we can have serial lovers, cohabitation now being socially acceptable. And single motherhood is "a choice"—at least for those women with incomes sufficient to support a family and all the "caretakers" they must employ if they are to maintain their position as breadwinners.
One must ask, too, if men's role as marriage partners and fathers is gradually being eroded. In my research as a sociologist I speak with many women who are so involved with their careers that they don't have time for a serious, committed relationship, or who have such unrealistic expectations of marriage that no man could meet them. As a woman who likes men, I can't help thinking they are being reduced to "side dishes"—tasty additions to life, but not the real meal. Side dishes and suppliers of sperm.
Monica B. Morris
Los Angeles, Calif.
Speaking as a father in my mid-thirties, I must take issue with Lori Gottlieb. Her entire article is based on the erroneous assumption that it is okay for a child to grow up without a father—that men are somehow optional in the rearing of children. The majority of social opinion appears to be to the contrary, as evidenced by the amount of time and effort state governments have spent enacting "deadbeat dad" laws in an attempt to force men to take more responsibility for their offspring.
Furthermore, Gottlieb's inability to find Mr. Right is difficult for me to believe. I know countless men who are both kind and loving husbands and kind and loving fathers. Perhaps it is not that the men around her are lacking but, rather, that her own huge ego keeps getting in the way. Gottlieb does an excellent job of defining an entire subculture of women in America today who are so self-important that no man will ever be good enough for them. If they cannot set their egos aside long enough to settle down and get married, how do they expect to raise a child—arguably the most selfless act most adults undertake—by themselves?
And while Gottlieb worries about what to say to her child if she has a daughter, what of all the baby boys conceived with donor sperm? What do their mothers tell them? Sadly, the message being handed down to this next generation of sperm donors is that fatherhood is little more than a biological act without social or emotional consequence.
Matthew D. Taylor
Lori Gottlieb's article was a terrific read. As a married father of two, I offer some free advice. If Gottlieb thinks it's "bad form" to mention her donor's identification number, wait until she finds a good and dependable babysitter. If someone asks her for that individual's name, mentioning it will be really bad form.
W. Charles Bailey Jr.
Lori Gottlieb replies:
I agree with Kathryn Dailey. Now that I'm pregnant, I realize how useless the process of shopping for specific traits was. Besides, even if my baby emerges with all the recessive traits I never thought I (or my donor) carried, I'm already too in love with him or her to care.
It's surprising that Eileen Fay, a single mother by choice, takes issue with women who make the same choice but use a different method of conception. There is simply no evidence that a child is better off if the donor is known rather than anonymous (in fact, in many cases children are heartbroken by the abandonment of a known donor). I wonder if she would similarly label "irresponsible" infertile couples who use an anonymous sperm donor instead of a family friend's sperm in order to have a family.
I am confused by Monica Morris's implication that women like me are reducing men to "suppliers of sperm." On the contrary, we love and respect men enough to treat them as individuals. If we thought all men were interchangeable for this purpose, we would have simply married one.
I don't believe, as Matthew Taylor contends, that I'm sending my child the message that fatherhood is irrelevant. The message of my story is that life involves tough compromises, and this seemed to be the best option for both me and my child. Given that nearly a third of all families in this country are headed by single parents, does Taylor believe that people who divorce are selfish and egotistical too? That raising children in a tension-filled household or shuttling them back and forth between homes is better than raising them as a single, stable, loving biological mother? I'd suggest to Taylor that having a child alone requires a woman to let go of "self-absorption" and to set aside her "ego."
Does Benjamin Schwarz seriously believe that Fred MacMurray's performance in Double Indemnity (1944) is "one of the best in the history of American film" (Editor's Choice, September Atlantic)? After many years of watching movies and of reading criticism and film commentary, I'd say that's a new one on me. While most observers rate Double Indemnity as a career highlight for MacMurray (as he did during his lifetime), most of the praise for acting in that film has over the years (rightly) gone to Barbara Stanwyck. It looks to me as though Schwarz has just founded a fan club of one.
In his review of Eamon Duffy's The Stripping of the Altars (Editor's Choice, October Atlantic), Benjamin Schwarz expresses surprise that Duffy did not cite a line in Shakespeare's Sonnet 73: "Bare ruined choirs, where late the sweet birds sang."
In my reading, this line is all a part of the trope of the poem: the speaker comparing his old age to nature in winter. "Choirs" is so obviously in apposition with "boughs" in the line above ("Upon those boughs which shake against the cold") that I wonder how anyone could think to take it otherwise than "I am now an old man who not so very long ago was much like a blossoming tree in whose boughs birds warbled sweetly." This is a love sonnet, not a historical/political play.
Benjamin Schwarz replies:
Well, Mr. Green, that's why God made chocolate and vanilla. In his performance MacMurray brilliantly marries menace and affability. He's created a truly decent sociopath, and his is among the smoothest and most natural performances in American cinema. Stanwyck did a splendid job, of course, but she was playing her favorite and most familiar role here: the tough, sexy dame. I prefer her in Preston Sturges's The Lady Eve.
As for Daniel Myers, he seems to have a one-dimensional view of Shakespeare's sonnet, and he seems to be a somewhat simplistic reader. The sonnet is a love poem, but it also resonates with political and historical meaning. Nearly every Shakespearean scholar has read the line as, among other things, a reference to the dissolution of the monasteries. It's also, of course, a comparison of the speaker's old age to nature in winter. For a more nuanced way to read this sonnet, and literature in general, Myers should take a cue from this passage in William Empson's classic Seven Types of Ambiguity:
The fundamental situation, whether it deserves to be called ambiguous or not, is that a word or a grammatical structure is effective in several ways at once. To take a famous example, there is no pun, double syntax, or dubiety of feeling in Bare ruined choirs, where late the sweet birds sang, but the comparison holds for many reasons; because ruined monastery choirs are places in which to sing, because they involve sitting in a row, because they are made of wood, are carved into knots and so forth, because they used to be surrounded by a sheltering building crystallised out of the likeness of a forest, and coloured with stained glass and painting like flowers and leaves, because they are now abandoned by all but the grey walls coloured like the skies of winter, because the cold and Narcissistic charm suggested by choir-boys suits well with Shakespeare's feeling for the object of the Sonnets, and for various sociological and historical reasons (the protestant destruction of monasteries; fear of puritanism), which it would be hard now to trace out in their proportions; these reasons, and many more relating the simile to its place in the Sonnet, must all combine to give the line its beauty, and there is a sort of ambiguity in not knowing which of them to hold most clearly in mind. Clearly this is involved in all such richness and heightening of effect, and the machinations of ambiguity are among the very roots of poetry.
Two decades have passed since Rick Moody spent his years in Columbia's M.F.A. program being miserable ("Writers and Mentors," Fiction Issue 2005). He might like it better now. The program is still big, yes, but many of us view its size as a strength. Columbia fiction students represent a wide range of style, subject matter, and thematic preoccupation unavailable at smaller programs. Moreover, the program is no longer internally competitive in the way Moody describes, allowing us to lean on those individual voices for reactions that extend our understanding of both our readers and our own work. The faculty as a whole subscribes to no specific aesthetic, the range of teachers' interests reflecting the diversity of their students.
Moody claims that the ethics of the corporate-governance model, applied at the "topmost levels" of Columbia, "must certainly trickle down into individual departments" and affect how writing workshops are run. He strongly implies that as a result, current writing students are stuck asking predictable questions in order to streamline results and ensure quality control. He is simply wrong. Workshop atmosphere is dictated not by the favorite management model of Columbia's president, Lee Bollinger, but by the instructor and the workshop participants.
Jae Won Chung
Columbia University
New York, N.Y.
Many of the people who dismiss graduate programs as inessential, perhaps destructive, are also writers who have graduated from the very top M.F.A. programs themselves. Maybe the workshop
Matthew Griffin
Columbia University
New York, N.Y.
Workshops are not about "photocopying stories, handing them out, collecting responses, handing back the responses." They are about arduous and serious labor, about reading and writing and critical thinking and challenging ourselves on all fronts. There is nothing corporate about the workshop; it is democratic in design. It is not creative writing by committee. To suggest so demeans the intelligence of student writers.
Workshops are untidy and often unpredictable and as complex as the manuscripts, students, and leaders who tenant them. The houses are not all gray. It disheartens me to read the same-old, same-old complaint from Rick Moody, because I admire the risks that he takes in his own work.
Joan Connor
Director, Creative Writing
Caroline Elkins ("The Wrong Lesson," July/August Atlantic) worries that our war on terror is to some extent being guided by "a flawed historical analogy" drawn from British experiences in counterinsurgency, courting the risk that we may replicate repressive imperial practices. As the one who injected the pseudo-gang analogy into our strategic bloodstream, I can only say that I share her concerns. Indeed, it is because I worry about our over-reacting to the global intifada that I have advanced a number of smaller-scale alternatives, including the use of pseudo-gang surrogates. I find it far more sensible to fight dispersed terror networks with roving networks of our own than to rely on "shock and awe" aerial bombardment and traditional military occupation, the methods that have characterized too many of our efforts over the past few years.
We'll do better with this nimbler, more networked approach. And we can adopt it without embracing the coarse conduct that accompanied Frank Kitson's mad Arthurian quest to hold together a dying empire. For we are not trying to hang on to colonies; we are tracking a network operating in more than sixty countries. We want members of terrorist cells and their potential recruits to fear making contact with others, and then to be duped by our own cells when they finally risk reaching out. There is no need for a reign of terror anywhere as a prerequisite for undertaking such a campaign.
Perhaps the point is that history does not lock us into a predestined route. When Santayana spoke of learning the lessons of history, I'm sure he had in mind identifying what has enduring value and avoiding a repeat of the calamitous. Elkins admits that the pseudo-gang idea could be effective, but she believes we will adopt a host of terrible practices along with it. Though I thank her for sending up this flare, she's a little late. Between the mistreatment of detainees and the heavy noncombatant casualties in Iraq, we are already in a parlous ethical state. Carefully adopting innovative approaches is the only way to get out of the hole we have dug for ourselves.
John Arquilla
Professor of Defense Analysis
U.S. Naval Postgraduate School
Caroline Elkins misrepresents both current American strategy and the historical record of British counterinsurgency. Her claim that American practice is converging on the British model is largely false, and her polemical treatment of British counterinsurgent policy is misleading. If one accepts the need to restore civil order in Iraq, then the appropriate question is how best to accomplish the task. British counterinsurgent practice, though not without its faults, can improve the efficacy and precision of U.S. strategy in Iraq. The skillful use of defectors is central to victory in counterinsurgency. Defection removes enemy fighters without the damage associated with offensive operations and provides invaluable insights into insurgent organizations. Compared with the current series of raids and sweeps along the Syrian border, pseudo-gangs composed of such defectors are a more viable and less destructive means of exploiting the fissures in the Sunni resistance.
If Elkins's grasp of current U.S. strategy is tenuous, then her attempts to link British counterinsurgent policies to post-colonial misbehavior are weaker still. No serious student of contemporary Malaysian politics attributes Mahathir Mohamad's imprisonment of his deputy prime minister to British colonial or counterinsurgent policy, and no informed person would equate such actions with the sweeping violations of human rights in North Korea, Belarus, and Zimbabwe. Elkins's attempt to shift the blame from contemporary local elites to the former colonial powers is deeply unfortunate; these arguments support those who would rather divert attention from atrocious governance than face the need for genuine reform. When scholars at first-rate institutions can survey the current scene and conclude that "the real 'outposts of tyranny' are the institutions left behind by the colonial and military strategists in Britain's twentieth-century empire," then we have cause to worry that academia is no longer an outpost of reason.
Colin Jackson and Austin Long
Massachusetts Institute of Technology
Caroline Elkins asks, "Are pseudo-gangs really the best model for the United States in its global war on terror?" Yes, they might be—but even if they are not, they are a tried and true method of counterinsurgency that has been employed with great success not only by General Sir Frank Kitson but also by Los Pepes, the U.S.-sponsored paramilitaries fighting Pablo Escobar in Colombia. The lessons, insights, and observations of Kitson and others form the foundation for almost every successful counterinsurgency campaign worldwide, and are as pertinent today as they were fifty years ago. Furthermore, trying to impose twenty-first-century standards of morality on Kitson is akin to demonizing Jefferson and Washington as slave-holders even though they helped write the Declaration of Independence and win the Revolutionary War. As a Marine who has served in Iraq, I believe pseudo-gangs would work in a limited application that directly targeted those who aid the insurgents. However, these organizations often grow as powerful as the lawful armed forces, and thus become inclined to compete for power, authority, and legitimacy. Once that happens, they are difficult to disband. Given that nothing about the insurgency in Iraq is radically innovative, we should not keep looking to new concepts for the answer. We continue to disregard successful counterinsurgency campaigns undertaken in the Philippines and El Salvador, and also lessons learned from Southeast Asia and the Banana Wars. Elkins and others should embrace and spread the word about Kitson, not vilify him—or we as a nation may be forced to painfully relearn lessons he articulated thirty-four years ago.
Major Adam Strickland
Caroline Elkins replies:
To varying degrees each of these letters endorses the view that desperate times call for desperate measures. But neither the United States nor any other power can adopt counterinsurgency techniques without taking into account the context in which they were first used in the twentieth century, and the context in which they are being used today.
At issue is the line between legitimate and illegitimate means. If history teaches us anything, it is that this line must be closely policed in times of war, particularly when the maneuvers are being forged in political circumstances that have such strong parallels to those of Kitson's time. Although I applaud John Arquilla's concerns, I would also suggest that the warning "flare" is not too late. Current wisdom suggests that the resistance could last at least another ten years. The insurgencies in such former British colonies as Kenya and Malaya dragged on for a decade or more. Indeed, Britain's counterinsurgency tactics eventually brought those wars to an end, though at continuing cost to the regions in question.
The Calendar section of the September Atlantic describes Barry Bonds as "clos[ing] in on Babe Ruth's home-run record." I think that a fellow named Henry Aaron has held this record for the past few years.
Matthew Quirk replies:
He certainly has. Although Bonds, with the third-highest number of career home runs (703), is closing in on No. 2 Babe Ruth's long-standing record of 714, he is still a way off from Hank Aaron's all-time record of 755.
In his article "Meltdown: A Case Study" (July/August Atlantic), Benjamin M. Friedman suggests that "the new Statue of Liberty (completed in 1886) … proclaimed America's welcome to the world's 'huddled masses' and 'wretched refuse.'" In fact, at first the statue did not proclaim welcome to anyone. It was given to the citizens of the United States by French citizens in honor of our liberty and the hope that such liberty would find its way into other countries. The statue didn't extend "America's welcome" till 1903, when the last five lines of the sonnet "The New Colossus" were inscribed on its pedestal.
Palm Desert, Calif.
One aspect of John Gardner's philosophy that Mary Gordon ("Moral Fiction," Fiction Issue 2005) does not touch on directly is his belief in the writing process as a kind of sacred cauldron for cooking down characters and creeds to their most elemental forms. A work can be moral, Gardner says, if it deals with any subject—pleasant or otherwise—in a way that carefully considers all sides. (Implicit in this theory is that "good" will always "win" if one gets the process right, a leap of faith that taxes my optimism.)
This notion of the inherent morality of the "true" writing process is connected to Gardner's belief, as a teacher and critic, that fiction should be a "vivid and continuous dream," immersing the reader in its world from the first sentence of a story or book. Anything that calls attention to the fact that you are reading a made-up story is immoral—even though Gardner himself dabbled in such tricks in October Light and other books.
Gardner's pronouncements on moral fiction—in his criticism, interviews, and even fiction—are varied and sometimes contradictory. On Moral Fiction itself nearly torpedoed his career, and the subject matter seemed in some ways to bring out the worst in him. Yet his passion for the writing process, and his belief in the ability of art to be inspiring rather than just entertaining, are infectious and, at some level, central to the reasons why any creative person creates.
Scotch Plains, N.J.
Terry Castle writes in "Gender Bending, Pt. 1" (September Atlantic) that she has yet to meet a female contemporary who has read Seven Pillars of Wisdom. Though I am not sure I can be considered a contemporary, having recently qualified for Medicare, Castle may be interested to know that I read this book some forty years ago, when my husband worked as a logging engineer for a well-known oil-service company in Egypt. Our environment provided few distractions apart from a small daughter born in Cairo, so when I came across the book I indeed read it from cover to cover.
Let me point out two factual errors in Christopher Hitchens's review of Salman Rushdie's Shalimar the Clown ("Hobbes in the Himalayas," September Atlantic). First: the correct name of the Indian epic he alludes to is the Ramayana, not Ram Leela. Second: the Bhagavad Gita, which Hitchens calls an Indian epic, is not one. It is a philosophical poem, in Sanskrit, in 700 verses, by an anonymous poet.
K. N. Kutty
Professor Emeritus, English
Eastern Connecticut State University