In his reply to Alecia Flores, who chided him for taking a gratuitous potshot at President Kennedy (Letters, July/August Atlantic), Christopher Hitchens—apparently riding his one-trick pony into the ground—takes yet more potshots at a man who can’t fire back!
For instance, he refers to JFK as “the boy president”—a potshot that also has the virtue of being incomprehensible, since Kennedy was forty-three when he was elected president and had already been a congressman, a father, and a combat veteran. How much, one wonders, had Mr. Hitchens done by that age? How would he have liked being called a boy?
And then there’s that old favorite: the smirking insinuation that Ted Sorensen, not JFK, wrote Profiles in Courage. We have mountains of sworn statements; we have eyewitness testimony to the actual act of writing (JFK was in the hospital during part of the book’s composition, and, predictably, he was a great favorite of the nurses); we have handwritten and annotated drafts of every single page of the book. But does any of that matter? No! Not when a nod and a wink can pass for knowledge of a subject.
And how many witnesses, how many long-suffering dinner guests trapped in mini-lectures by the commander in chief, how many subsequent references and remarks made by JFK himself, does Mr. Hitchens need before he can refrain from implying that Kennedy wouldn’t have read such books as Cecil’s Melbourne or Agar’s Price of Union?
Finally, Mr. Hitchens gets his dramatis personae wrong. JFK wouldn’t have been “the Galahad of Camelot”—of course, he’d have been the king. One wonders how much patience he’d have had with bargain-basement Mordreds.
The travesty of justice undergone by Douglas Preston and Mario Spezi (“The Monster of Florence,” July/August Atlantic) is the tip of the iceberg. The Italian judiciary (which includes the public prosecutors) is a branch of the civil service. This particular branch chooses its members, is self-ruling, and is accountable to no one: a state within the state! This body of bureaucrats can be roughly divided into three sections: a large minority, corrupt and affiliated to the former Communist Party; a large section of honest people who are too frightened to stand up to the political minority (which controls the offices of the judiciary); and a minority of brave and honest men with little influence. Political and dishonest judges have an infallible method of silencing or discrediting opponents, political or otherwise. A bogus indictment, the tapping of telephones, the conversations (often doctored) fed to the press to start a smear campaign, a spectacular arrest, prolonged preventive detention under the worst possible conditions, third-degree interrogations, and finally a trial that lasts many years and ends in the acquittal of a ruined man. Spezi was lucky, because the powerful Florentine public prosecutor is no friend of the Perugia prosecutor’s and, I am told, “suggested” that Spezi be freed; the Perugia court, I am told, accepted the “suggestion.”
Count Neri Capponi
I somehow missed the connection between Stuart Taylor Jr. and Benjamin Wittes’s professed concerns about the Supreme Court (“Of Clerks and Perks,” July/August Atlantic)—the literary projects of various members, the perceived lack of civility, the justices’ choices about which cases they hear—and their proposed solution of firing the Court’s law clerks. Their actual agenda seems to be to make the job of a Supreme Court justice significantly less pleasant. (Note the nostalgic reference to times when being a justice was “a grind.”) Why don’t we also evict the justices from their offices, and expect them to write their opinions at Starbucks?
As Taylor and Wittes must know, it is customary in almost all law firms for a junior lawyer to draft a document and a senior lawyer to review what the junior lawyer has prepared. It’s not clear to me why judges should operate differently. Now, a good senior lawyer may well need to make significant changes to a junior lawyer’s draft; it appears from their summary, though, that this already happens with the current members of the Court.
J. Thomas Oldham
John Freeman Professor of Law
University of Houston Law School
In their Swiftian proposal to make Supreme Court justices work more by firing their law clerks, Stuart Taylor Jr. and Benjamin Wittes confuse caseload and workload. The Court’s 160 “full decisions” in 1945 were published in a single volume that was about 1,300 pages long. The last term’s sixty-nine signed opinions will be published in three volumes totaling nearly 4,000 pages.
The reason the justices “don’t have to hear any case they don’t wish to” is that the late Chief Justice Rehnquist was determined to focus on the cases of most lasting consequence, and Congress obliged by eliminating most of the Court’s mandatory jurisdiction. Able to shape the field of its pronouncements, the Rehnquist Court ruled more by ruling less. The opinions became mini-treatises, and the basic legal research had to be perfect. Hence, the rise of the law clerk.
In “Shock Absorption” (June Atlantic), Clive Crook assures us that the world isn’t running out of oil. But the world has been running out of oil from the moment we began using it. And the question isn’t when we will run out, but when production will peak and begin to decline. That’s when our growing population will collide with shrinking energy availability—and many oil experts believe that oil production is at or very near its peak today. Crook’s assurance that we have enough oil to last for decades makes it clear that he doesn’t understand the problem. A February 2005 Department of Energy study found that we would need to start a crash program twenty years before global oil production peaks in order to avoid severe economic disruption.
Carl R. Henn
Clive Crook claims that we have both plenty of oil and decades in which to manage the threat of global warming, but he’s wrong on both counts. Some of those “proven” oil reserves that Crook asks us to rely on are real; others are just figures pumped up to make oil companies and governments look good; still others are in places so remote or so politically unstable that they should hardly be counted at all. The cheap and easy oil—the light, sweet crude that gushes from wells under its own pressure—is a thing of the past. What’s left is ever harder and more costly to extract, and ever lower in quality.
As for global warming, with carbon dioxide at 385 atmospheric parts per million—far higher than at any other time in the history of civilization—it’s already well on its way toward changing sea levels and patterns of rainfall. We don’t have the leisure to leave it to be dealt with over “an extended span of time.”
David A. Korn
Alissa Quart solicited a variety of professional views about the flexibility of developmental windows (“Extreme Parenting,” July/August Atlantic), but these opinions are countered by the evidence—noted in my book, Learning Before Birth: Every Child Deserves Giftedness—that the brain’s postnatal plasticity is dwarfed by the cellular proliferation and massive die-off that occur in the womb. Two months after conception, a genetic program triggers the manufacture of brain cells, about 250,000 per minute for ten weeks, with only 10 percent of this production surviving to gestation’s end.
How, then, to substantially and safely upgrade human potential? Research shows that virtually all sounds in the maternal environment (television, radio, conversation) quite audibly reach the unborn child. Yet the loudest sound—cresting at ninety-five decibels, loud as a rock band—is the noise of the mother’s blood constantly pulsing past the placenta. Although the fetus spends much of its time asleep, brain-wave activity demonstrates unconscious awareness of this booming stimulus, probably an ancient surveillance system. Moreover, the mother’s heartbeat creates an imprint (a simple repetitious pattern that becomes cortically ingrained), through a form of involuntary learning that concludes at the end of a full-term pregnancy.
In 1982, I discovered how to record this in utero blood pulse, then electronically digitize this natural sound at subtly faster rates, sequentially introducing slight tonal shifts—in effect, a cardiac curriculum—that, when played back to the baby a few hours every day, would exercise its forming mental circuitry. The point was to imprint memory with recognizable variants (not sophisticated music or verbal content, both of which are white noise to a fetus) so that when the moment of cataclysmic cell death arrived, less pruning would take place, because a more mature brain had begun doing what humans do best: playing with age-appropriate patterns.
As complexities mount everywhere, no topic is more pressing than how our outmoded abilities and behaviors can be improved. Extreme parenting—at the right time—is essential under extreme conditions.
Director, Prenatal Institute
Alissa Quart’s characterization of Maria Montessori’s insights into how children learn was misinformed. The idea that “pupils would learn willingly if their schoolwork were more like play” is in fact antithetical to Maria Montessori’s original observations and intentions. The Montessori method is founded on the premise that children thrive when learning practical skills and using genuine tools. Montessori discarded toys from her Casa dei Bambini because she observed that the children she had gathered from the slums of Rome preferred to work, rather than “play.” As a result, toys are conspicuously absent in any Montessori classroom, and children are engaged in practical tasks such as cleaning, sewing, cooking, and organizing, as well as tasks that involve hands-on approaches to mathematics and language.
Carrie Frederick Frost
Outgoing Director, East River Montessori
Bluefield, W. Va.
In response to Alissa Quart’s article, I wish to clarify the position of Parents’ Action for Children regarding the needs of young children, and to describe the focus of our work today. We share Ms. Quart’s concerns about the recent vogue of “extreme parenting,” but urge readers not to throw the baby out with the bathwater. Contemporary research overwhelmingly confirms our view that “the first years last forever”—not in the sense that developmental windows slam shut according to rigid timetables, but in the sense that parents do have unique opportunities during these years to build strong emotional, cognitive, and social foundations for their children’s entire lives. And we regret that growing awareness of these opportunities is being aggressively and often misleadingly exploited by the “Baby Genius Edutainment Complex.” Far from encouraging the industry’s overhyped quick fixes, Parents’ Action recommends the kind of intentional day-to-day, one-on-one interactions—such as talking, reading, singing, and playing—that are key to parental bonding and early development for infants and toddlers.
President and CEO
Parents’ Action for Children
Three cheers (or is it four?) for Matthew Stewart’s article (“The Management Myth,” June Atlantic), which not only points out that the management emperor is naked but also accurately describes the con artists who sell the marvelous management suits in which all of our finest enterprises clothe themselves—suits that only we poor commonsense fools cannot see.
But why does Mr. Stewart suggest that we substitute academic philosophers for M.B.A.s—that is, one set of charlatan tailors for another? Successful business is essentially the pragmatic art of survival. It is about challenge and competition, prospering in the face of ever-changing obstacles and threats. Might it not be better for our business leaders to study the campaigns of Brasidas, Alexander, and Hannibal, rather than the musings of Plato, Aristotle, and Lucretius? Might they not be better prepared if they studied Clausewitz and Sun Tzu, rather than Hegel and Confucius?
C. Kerry Nemovicher
While I understand Matthew Stewart’s doubts about the value that management consultants deliver, his argument confuses popular books on management with what one learns in a graduate school of business. Most books on business are indeed fad driven, imprecise, poorly argued, and unscientific, but few of them actually appear in a good business-school curriculum. Instead, business-school classes are filled with far drier and more useful lessons on economics, accounting, operations management, and finance. Mr. Stewart noted that in his reading, “not once did I catch myself thinking, Damn! If only I had known this sooner!” I doubt that Mr. Stewart was reading books that form the backbone of business-school syllabi, such as Competition Demystified, by Bruce Greenwald and Judd Kahn; Analysis for Financial Management, by Robert Higgins; and Inventory Management and Production Planning and Scheduling, by Silver, Pyke, and Peterson. If he had, he might have enjoyed more of those moments of insight.
Overall, Stewart seems to disdain the study of business because it has neither the scientific certainty of many other pursuits nor the rigorous logic of philosophy. But such a distinction applies equally to all other social sciences, which deal with some of the weightiest questions humanity faces. To give up on these fields because their answers do not fit into neat frameworks seems foolhardy.
New York, N.Y.
I am writing to offer a correction and a clarification to “The Management Myth.” First, the correction: the article states that in 1908, Harvard “opened the first graduate school in the country to offer a master’s degree in business”; actually, the Tuck School of Business at Dartmouth was founded eight years earlier, in 1900, as the country’s first graduate school of management.
More important, a clarification: Mr. Stewart himself fell for a myth about the modern M.B.A., which is that it can be equated to a survey of today’s popular best sellers. There is a strong element of truth to his central thesis, that Rousseau and Shakespeare offer a better grounding for business management than a library of the gurus’ tomes. Many of our best students here at Tuck come to us with a broad liberal-arts education, and successful business leaders are as likely to have undergraduate degrees in poetry or fine arts as in finance or economics. But the popular management best sellers are only a small sliver of a true business education. An M.B.A. from a top business school gives you a solid foundation in quantitative fields like accounting and finance; it gives you the terminology, the underlying theories, and the best practices; and it allows you to expand your understanding of how that all works in actual business settings. It also explores the “soft skills” that are so important, from ethics to leadership to communication.
If Mr. Stewart is interested in sampling the full spectrum of a modern M.B.A., we’d love to have him up here at Tuck and do our best to kill some myths together.
Dean, Tuck School of Business at Dartmouth
Matthew Stewart replies:
On the question of fact: the Tuck School of Business at Dartmouth was indeed founded first, as Paul Danos says; but Harvard was first to offer a master’s degree in business administration.
I am grateful to Mr. Danos and John O’Brien for pointing out what I should have made clearer in my article: that M.B.A. programs and the management best-seller lists are two different things. I treated both in one article, not because I think they are the same but because they both draw from the same origins in the history of management thought and contribute (in distinct ways) to the perpetuation of the management myth.
At the same time, I fear that Mr. Danos is overly optimistic when he suggests that business education is largely free from the taint of the best-seller lists. In the section of my article in which I discussed M.B.A.s, I did name one “guru,” Michael Porter—a professor at the Harvard Business School—and I’m sure that a refined analysis of the popular literature would make room for the special class of “tenured gurus.” I would also like to alert Mr. Danos to the fact that some decidedly untenured gurus have been worming their way onto the curricula, as business schools bow to the demands of their customers—I mean, students—for the latest goods.
Nonetheless, ever the optimist, I support scholarly, empirically grounded, and well-written research into business, and I even hold out hope that it might one day elbow aside some of the pulp on the best-seller lists. I’m sure Mr. O’Brien will be relieved to hear that he and I are both with Aristotle on this one: in the study of any particular subject, we should seek the level of precision of which it admits, not more or less. But, as Aristotle (or, for that matter, any academic) might add, conducting research and preparing students for business careers are two very different things. At the end of this particular debate, I foresee a case for more business Ph.D.s, not more M.B.A.s.
Finally, I’d like to assure Mr. Danos, his fellow deans, and other readers that I do not advocate the wanton demolition of business schools. In fact, I thought I was being rather generous in my article when I pointed out that M.B.A. programs deliver a certain amount of useful, specialized knowledge to their students; that the best programs make an effort to impart relevant skills; and that most graduates I know are, on balance, positive about the experience. My concern is this: once you take away the benefits that are extrinsic to the content of the education (e.g., the social and cultural capital and the placement services), there is little that an M.B.A. degree can do that an advanced degree in any number of other fields can’t. This doesn’t mean that prospective students should tear up their B-school applications (those extrinsic benefits, for one thing, are pretty hard to replicate); but it does say something about the nature of management education, and I hope it will embolden recruiters and job seekers to, er, think outside the box.
Mark Bowden states (“The Desert One Debacle,” May Atlantic) that a group of soldiers from the 75th Ranger Regiment out of Fort Benning, Georgia, participated in the Desert One mission. In fact, the Rangers that participated in Operation Eagle Claw were from Company C, 1st Battalion (Ranger) 75th Infantry, stationed at Hunter Army Airfield in Savannah, Georgia. The 75th Ranger Regiment was not formed until 1984, when the 3rd Ranger Battalion and Regimental Headquarters were activated.
David H. Lewis
Mark Bowden replies:
I regret the mistake, and I’m grateful for the correction. It will be fixed in my book.
Your item regarding the occurrence of mental illness in former U.S. presidents (“Primary Sources,” June Atlantic) encourages readers to accept the popular notion that 50 percent of Americans suffer from one kind of mental illness or another. In particular, diagnosing bipolar disorder in Presidents John Adams and Lyndon Johnson, both known by historians to be men of passion and drama, is a reflection of the current absurdities of the U.S. mental-health-care system. The common feature of all mental illness is not the presence or absence of any particular symptom, but the inability to function normally. Men who have achieved the presidency have conclusively demonstrated their ability to function. That the authors of the report you cite are apparently unable to grasp this fundamental feature of diagnosing mental illness, which is taught to every first-year medical student, is a telling example of how the U.S. mental-health community overreaches in its quest for influence and legitimacy.
Robert Graff, M.D. (psychiatry)
San Leandro, Calif.
I’m surprised that the list of presidents identified as suffering from a “mental illness” makes no mention of Andrew Johnson’s alcoholism. During his tenure as military governor of Tennessee, observers suspected Johnson of absenting himself from his duties for days while on benders. After being elected as Abraham Lincoln’s second-term vice president, Johnson spent the night before the inauguration drinking with friends. Scheduled to make brief remarks to the Senate, he gulped down two huge brandies in front of witnesses, staggered into the Senate, and delivered a rambling and incoherent speech that mortified Lincoln.
Author, Vessels of Rage: The Secret History of Alcoholism
In “The Next Starbucks?” (July/August Atlantic), Virginia Postrel mistakenly implies that massage developed because of prostitution, that it is only for making people “feel good,” and that our profession adopted the term therapy to suggest “that it’s good for you, which means you don’t have to feel guilty about spending money on it.”
First, massage therapy has nothing to do with prostitution. Prostitutes have co-opted the word massage to hide what they do. Second, while massage does feel good, clinical research shows it also can provide significant relief from stress, aid in recovery of muscle injuries, increase range of motion, and lower heart rate. Some people choose massage to pamper themselves, some use it to improve their athletic performance, some have regular massage to relieve stress, and some follow a physician-prescribed massage regimen to relieve pain or to help them recover from an injury or surgery. All of this is massage, and all of it is beneficial.
The American Massage Therapy Association believes all massage is therapeutic because it contributes to health and well-being. In 1983, our association dropped the ampersand from our name (it was then Massage & Therapy) in order to show that massage is therapy, and that there isn’t some other kind of therapy that massage therapists practice. To suggest that our profession has used the word therapy so people won’t feel guilty about getting massage is ridiculous and insulting to both massage therapists and the people they massage.
Mary Beth Braun
President, American Massage Therapy Association
To read Mark Steyn’s Post Mortem piece each month, one might think that women never die. Or worse, that women die but lead lives unworthy of critical reflection and commentary. How about a hint of gender equity?
I know a little about sugars, having taught chemistry for twenty-five years. In “Sweet Tea” (June Atlantic), Corby Kummer discusses different types of sugars found in drinks and foods, specifically fructose and glucose, even describing how they differ in stimulating production of insulin and leptin. Yet he then refers to table sugar as glucose (aka dextrose). It is not: table sugar is sucrose, a disaccharide of fructose and glucose, which means it must be biochemically metabolized into equal parts fructose and glucose.
I enjoyed Corby Kummer’s article on “Sweet Tea,” but I am confused by his statement that “[o]nly in the 1970s did researchers succeed in converting cornstarch to syrup.” During World War II, when sugar was rationed, we depended on Karo Light Corn Syrup as an all-purpose sweetener in many recipes. It was no different from the bottle that I have in my cupboard this minute. I baked my very first batch of cookies using corn syrup, from a recipe in my Girl Scout magazine. (I was probably eight or nine years old, and my mother couldn’t risk turning me loose with our precious supply of sugar.) Was the process used in 1970 somewhat different from that in the 1940s?
Port Angeles, Wash.
Corby Kummer replies:
Baking (and pecan pies) would be the poorer without Karo Corn Syrup, which is valued for its ability to retain moisture (and thus increase shelf life) and has been made, according to Harold McGee’s On Food and Cooking, since the mid-1800s. The process for converting cornstarch into high-fructose corn syrup, developed by Japanese researchers, is different, and first came into industrial use in the 1970s.
I didn’t refer to table sugar as glucose, but in my review of the differences between fructose and glucose the analogy was indeed implied. Thanks to Mr. Shalit for catching my mistaken implication.
An article in the September 2005 issue of The Atlantic Monthly, “In a Ruined Country: How Yasir Arafat Destroyed Palestine,” by David Samuels, made several references to Mohamed Rachid, a former senior official of the Palestinian Authority (PA) and the Palestine Investment Fund (PIF). Subsequent to publication, Mr. Rachid, who declined repeated requests to be interviewed by Mr. Samuels, contacted the magazine to clarify portions of the article. The references to Mr. Rachid were intended to illustrate certain claims relating to the financial structure and activities of the Palestinian Authority and its late chairman, Yasir Arafat, and not to allege any fraudulent or unlawful conduct on the part of Mr. Rachid. The article did not state, nor did it intend to imply, that Mr. Rachid transferred PA or PIF funds to his individual account or used such funds for his personal benefit.