Paul Bloom's excellent piece "Is God an Accident?" (December Atlantic) raises the question of what evolutionary events caused our brains to be hard-wired to embrace the idea of a supernatural, all-powerful being.
Perhaps it wasn't a genetic accident after all. Many years ago my anthropology professor at the University of Colorado offered this explanation, which still seems like a reasonable one: At some point some of our predecessors began to bury their dead to prepare them for what they may have believed was an afterlife. The clans that followed this practice had higher survival rates, owing to improved sanitation, than the clans that left their dead where they fell. This, in turn, served to pass along the genes that promote the notion of a supernatural god, and superstitions as well. Those genes are still with us today.
As Bloom points out, some 95 percent of human beings believe in a supernatural god. It stands to reason that the process of natural selection must have been at work here, just as there are good evolutionary reasons why almost all human beings crave fats and sugars, exhibit jealousy, and so forth. That's why, even in the face of overwhelming evidence to the contrary, most Americans still hang on to the nutty notions of heaven, hell, Adam and Eve, the Great Flood, and the Resurrection.
John Burgeson
The existential condition shared by human beings does predispose them to a belief in God. This predisposition, however, is not hard-wired into the human brain, as Paul Bloom's article seems to indicate, but instead emerges through linguistically mediated socialization. It is a product of the human software, so to speak, not the hardware. The distinction is crucial for understanding how the concept of God evolved and why it is ubiquitous.
Notwithstanding the large percentage of persons who believe that their "selves" (or "souls") will exist eternally after death, very few claim that they physically or spiritually existed before their birth. The explanation is quite simple: self-consciousness does not antedate birth, nor does it develop in a vacuum. Rather, we come to know and conceive of ourselves as "selves" by acting out the various roles in the social matrix into which each of us is born.
Although the self-concept is highly abstract, we conceive of our selves not as an aggregation of subjective representations, attitudes, and memories but as ontologically real objects. When we say "I am" or "you are," we impute substantive, self-existent Being to the abstractions "I" and "you." We perform the same trick with the concept of God.
All human beings are at least dimly conscious of the collective intellectual matrix in which their consciousness floats, and which reflects the images of their individual selves back to them. Sociologists, psychologists, and philosophers have referred to this collectively generated intellectual process by various names: the generalized other, the collective consciousness, the collective unconscious, the divine mind, the infinite intellect, the oversoul, and (to use Freud's psychoanalytical metaphor) the internalized superego. The great religions objectify it by such names as God, Yahweh, Allah, and Brahman. Indeed, it was the collective intellectual process on which Homo sapiens embarked eons ago, when the species began using vocally uttered sounds to symbolize externally posited forms, that "created"—or, if one prefers, "evolved"—the verbal structure through which we perceive what we call the universe.
Thus the concept of God is not something distinct from us. It is, rather, a reification of the community of consciousness that we have internalized as a part of us, that we share with other members of the human species, and that ultimately defines who we are. In Jesus' elegant phrase, the kingdom of God is within us. We sense this intuitively, although we quarrel interminably about the sanctity of our Lilliputian visions of the divine totem.
Thomas B. Pryor
Fort Smith, Ark.
Paul Bloom offers an interesting addition to the current literature on the phenomenon of religious faith, but his approach to the subject, like all others I have so far read, is to examine the faithful. Those of us without faith don't get much attention, perhaps because we are so few; we receive little more than a few quotations from prominent figures about ways to reconcile belief and nonbelief.
From a scientific perspective, it might be worthwhile to ask why some lack not only faith but also any impetus toward spirituality of any kind. I grew up in a Christian home, where I went to Sunday school and church, to prayer meetings on Wednesday evenings, to vacation Bible school in the summer, and after about age fifteen to a week of "church camp" each summer. I enjoyed the fellowship and thought Bible study interesting, but it never occurred to me that the supernatural aspects of Christianity might be true. Perhaps it's similar to vaccination against common childhood illnesses—more than 95 percent of those vaccinated achieve some degree of immunity, but a few don't. We have learned something about our immune systems by studying those individuals in whom vaccination doesn't "take"; maybe a look at the few truly areligious among us would aid in understanding faith.
David Bacon
Paul Bloom's lucid account of the infantile origins of religious beliefs such as life after death prompts a question: How do we outgrow the religiously infantile and identify what might constitute "mature" belief? Bloom mentions in that regard Sam Harris's excellent book The End of Faith, which points up the cruelty and inconsistency of much Jewish, Christian, and Islamic dogma, favoring instead Buddhist meditation as the most effective way to calm our universal fears in the face of life's mystery and unfairness. Freud, taking his cue from Marx, interpreted belief as a cushion of rationalization softening the hard facts of death and fate. What Freud omitted, and what science in general omits in order to avoid any dilution of objectivity, is the flip side of fear: awe. Wonder may be the erotic counterforce Freud hoped would outweigh the mingled wish for and fear of death he found deep in the human psyche. One way past Freud's pessimism is to meditate on what science has taught us about the universe, its unfolding phases of creativity, its interconnectedness, and its coming to self-consciousness in our own capacity for awe. Irrational religious beliefs may be planted in us as "a by-product of biological adaptations gone awry," but we are free to evaluate such beliefs in the light of what we have glimpsed, say, through the Hubble telescope. Someone asked Thoreau on his deathbed if he believed in an afterlife. "One world at a time," he is said to have replied—a response that shows us what religious maturity might look like.
Winslow Myers
As a Christian academic, I didn't see Paul Bloom's article as a threat to my faith. But I was bothered by its lack of rigorous academic honesty. It read like a tract straight out of the Church of Evolution, penned by the high priests of Science.
Of course, to Bloom and his ilk, if it isn't quantifiable, it isn't true. And since science is in the business of quantifying and measuring, it naturally follows that if it isn't science, it isn't true.
G. K. Chesterton once said, "A madman is not someone who has lost his reason. A madman is someone who has lost everything but his reason." In that sense, much of Bloom's argument was entirely reasonable.
Michael Bruner
Azusa Pacific University
Paul Bloom replies:
I appreciate the thoughtful remarks of John Burgeson and Thomas Pryor, but I disagree with them. Something can be universal or nearly universal without being the direct product of natural selection. Examples include back pain, visual aftereffects, hiccups, masturbation, and self-pity. A propensity for supernatural belief might emerge in a similar way—as a by-product of certain traits, including social reasoning, that are themselves adaptations. And while elaborate religious systems are shaped by culture, the foundations of these systems—the seeds of religious thought—show up even in babies.
How, then, as David Bacon asks, can we explain the small minority of people who have no spiritual belief? Every human trait displays variation, in some cases because of genes and in others because of environment, and we could explore the cluster of personality factors that lead people toward skepticism. But one can also deny the premise of this question: while our individual conscious attitudes plainly differ, there is evidence that all of us hold some supernatural beliefs at a gut level. Bacon might explicitly reject the supernatural, but I would bet that he sees himself as somehow independent from his body and finds the notion of life after death to be, at least at an intuitive level, entirely sensible.
In this regard religion has an edge over science. On the other hand, as Winslow Myers nicely points out, scientific explanations have their own advantages. When you look at such accounts of the origin of the universe, the evolution of species, or the nature of matter, they are far more interesting—indeed, more beautiful—than the sorts of stories one finds in religious texts.
You don't have to be an atheist to be interested in why children believe in God. The study of why people have religious beliefs in no way challenges the truth of such beliefs. But some of these ideas, such as the view that human beings were created in their current form by a supernatural being, are just mistaken, and deserve to be treated in the same way as the notion that the earth is flat.
This last point might be what upsets Michael Bruner, a self-described Christian academic. It is hard to tell, since instead of explaining his position he settles for accusing me of dishonesty and dogmatism. This does not seem very Christian, or very academic. But I did get a kick out of being called a high priest of Science.
The Atlantic's November College Admissions section completely ignores some of the more significant problems with the SAT and its use in the admissions process.
As a regional test-prep firm based in Chicago's North Shore suburbs, we have long been familiar with the advantage wealthier students have when preparing for the SAT and the ACT. Not only can they pay for courses and private tutoring, but we have recently seen a growing number of students declared "learning disabled" and thus granted valuable extra time or other accommodations on standardized tests. Many other students are prescribed performance-enhancing drugs such as Ritalin and Adderall, ostensibly to treat ADHD and other disorders.
The result is a skewing of SAT scores in favor of students wealthy and savvy enough to take advantage of these opportunities. One needn't make judgments about the value of various accommodations to agree that poor students inevitably receive far fewer of them.
Jay Brody
Brody Admissions
Ross Douthat ("Does the Meritocracy Work?," November Atlantic) offers damning statistics to show that kids who are admitted to the best colleges tend to come from wealthier-than-average families. However, these statistics ignore one critical point: families with college-age kids are by definition not average. For one thing, they are older than average by a considerable amount. The median income for all families in 2001 was about $42,000 a year. But the median income for families headed by forty-five-to-fifty-four-year-olds—those most likely to have college-age kids—was $58,000.
I doubt this explains all the economic disparities Douthat complains of, but I suspect it explains a lot of them.
Dave Munger
Ross Douthat's article was very interesting to me. I worked as a social worker for seven years for the state of Alabama and therefore had extensive contact with the poor. In order to work with them more effectively, I would send them for a psychological evaluation and IQ test. Their scores on the IQ tests were consistently low, around 75 to 80. This is the reason the poor are not attending college in large numbers. They do not have the intelligence (an IQ of 110 or higher) necessary to do college-level work.
My poor clients frequently told me they liked to work with their hands. They wanted good-paying factory jobs. We need to accept the lower quadrant in society with their limitations, and not outsource all the factory jobs overseas. All work is honorable. These people can contribute to society.
Seventy-five percent of the population has an IQ under 110. A Department of Education study shows twenty-nine out of every 100 students getting a degree, which is predictable if we consider how IQ limits who can do college-level work. I do not agree with Douthat's statement that IQ is not inherited. Studies of children who are adopted show that they have IQs within five points of their biological parents', regardless of their environment.
Colleges are not discriminating against lower-income people; nature has limited what they can do. Perhaps the intellectual elite should have more contact with the poor so that they can understand them better before they write articles about them.
Katherine Owen Sechrist
Mountain Brook, Ala.
I am in full agreement with Ross Douthat that higher education is not always about meritocracy. The barriers to college attendance and graduation for low-income students are immense and unreasonable, and too many institutions and individuals in higher education lose sight of social justice in the largely artificial rankings race. We need to get creative about admissions solutions, because higher education is and should be accessible to all students regardless of their ability to pay. The more higher education reproduces socioeconomic inequities, the more our society loses in terms of progressive thought, practice, and innovation.
However, Douthat's argument that graduates of elite or highly selective schools enjoy a lifetime of advantage completely discounts the fact that many of them work to reverse social and economic inequities in their postgraduate lives. Many have a "pay it forward" mentality that places a premium on using their education to change society in positive, more egalitarian ways. Students who have been raised with more resources than others often share this commitment with their families.
My fellow Stanford graduates (class of 1994), themselves from diverse economic backgrounds, are spread around the world, working to improve conditions among the impoverished and the disadvantaged on a global scale. Closer to home, they are in county hospitals, public schools, and philanthropic organizations. They are leading diversity initiatives in the workplace, mentoring young people who have few other sources of social support, and imbuing business with social consciousness. They are raising children with an emphasis on thinking outward, not inward. They are not fanning privilege.
Shannon Gilmartin
Los Angeles, Calif.
Ross Douthat replies:
Dave Munger makes a good point about how family income changes with parental age; ideally, studies of family income and college access would focus specifically on families with college-age students. I'm skeptical, however, that parental age explains "a lot" of the economic disparities. The child of a family earning the median income he cites, for instance, would have at least a 13 percent chance of earning a bachelor's degree by age twenty-four. A child from a family making more than $90,000, on the other hand, has a 50 percent chance of earning a B.A. over the same period.
I will not wade into the marshes of the IQ debate, but Katherine Owen Sechrist is no doubt right that many young Americans would be unlikely to succeed in college, or attend at all, even with a more level admissions playing field. The problem is that many low-income students who are prepared for college, and who score well on standardized tests, either don't go or enroll and then drop out after encountering economic and cultural hurdles. It's this population we should be concerned about, for the sake of both social mobility and America's long-term economic competitiveness.
And I only wish I could believe Shannon Gilmartin's appealing but unsupported assertion that more than a small percentage of young elites will spend their lives "working to improve conditions among the impoverished and the disadvantaged on a global scale."
I enjoyed Bernard-Henri Lévy's account of his visit with Norman Mailer in Provincetown ("In the Footsteps of Tocqueville," November Atlantic). His comments, however, do not reveal a close reading of Mailer's work. Lévy says that the hero of Tough Guys Don't Dance is gay. Tim Madden is straight, as is clearly revealed in several places in the novel, especially in the long opening conversation with his father. A more egregious error is Lévy's assertion that Mailer is "the most secular of American novelists." Even a cursory reading of Advertisements for Myself, Of a Fire on the Moon, Miami and the Siege of Chicago, or, most important, The Gospel According to the Son reveals Mailer's deep, idiosyncratic religiosity. For more than forty years he has written about his belief in a limited, imperfect God locked in a struggle with a powerful and wily Devil. Humanity is a third, often co-equal force, sometimes on the side of good and sometimes on that of evil. Mailer's well-considered theological beliefs are palpably obvious to his readers, who are many.
J. Michael Lennon
Wilkes University
Clark McCann (Letters to the Editor, November Atlantic) charges that Bernard-Henri Lévy "smears" the "intelligent design" advocate Jonathan Wells and his book Icons of Evolution. Forgive me for replying to another reader's letter, but there's always the chance someone might believe it to be accurate.
McCann begins by ridiculing the idea that Wells is part of "a Moonie conspiracy against evolution." But read Wells's own words, as quoted in the Science review of Icons: "Father's [Moon's] words, my studies, and my prayers convinced me that I should devote my life to destroying Darwinism, just as many of my fellow Unificationists had already devoted their lives to destroying Marxism. When Father chose me to enter a Ph.D. program in 1978, I welcomed the opportunity to prepare myself for battle."
I would need more space than a letter allows to demolish every point Wells makes in his book, so I will offer only my favorite example of his poor scholarship and limited understanding of science—one from my own field, molecular phylogenetics. In Icons, Wells attempts to cast doubt on the ability of DNA and protein sequences to untangle the common ancestry of species. He picks what he thinks are obviously ridiculous results from the scientific literature, and among these is one that "puts cows closer to whales than to horses." Now, anyone who has been following the subject lately knows that strong confirmation of this result has come from both molecular and fossil data. Not only are whales related to cows, but they are in fact close relatives of hippos, all in the group of even-toed ungulates, Artiodactyla. Horses are members of the odd-toed ungulates, Perissodactyla. To be fair, most of this confirmation came after Wells's book was published. But it does handily refute his argument.
Wells's book is scientifically useless. I am unaware of any textbook that was, or needed to be, corrected based on anything Wells wrote.
San Jose, Calif.
John Sellers's "Who Will Win the Nobel Peace Prize?" (The Odds, November Atlantic) misidentified the 2005 Nobel Peace Prize nominee Sri Sri Ravi Shankar as the sitar player Ravi Shankar. Sri Sri Ravi Shankar is the founder of the Art of Living Foundation and the International Association for Human Values, both nonprofit NGOs affiliated with the UN. I've volunteered for various projects of these fine organizations.
John Sellers replies:
Because the Nobel Foundation cannot, by its own statutes, release the names of nominees until fifty years after the fact, it is difficult to determine whether someone has been nominated or not. Any of the people or organizations that are eligible to nominate candidates may announce on their own whom they have nominated, but very few do so. There is plenty of documented speculation that Sri Sri Ravi Shankar was nominated, but no one, to my knowledge, has officially come out and said so. Meanwhile, it was also reported in the press that the sitarist Ravi Shankar was nominated. It is quite possible that both esteemed Shankars were nominated, or that neither was. We won't know for certain, however, until 2055. Which, of course, makes the Nobel Peace Prize all the more thrilling to bet on.
Tyler Cabot omitted a critical piece of evidence in his discussion of Saint "Padre" Pio ("The Rocky Road to Sainthood," November Atlantic). The Reverend Carlo Maccari, who alleged in a 1960 Vatican report that Padre Pio had had sexual relations with female penitents twice a week, later admitted that his accusations were false, and prayed to the falsely accused on his deathbed.
Paul Charles
Tyler Cabot replies:
The assertion that Maccari recanted on his deathbed has never been substantiated by objective sources. The claim seems to have originated in The Voice of Padre Pio, a Capuchin magazine read by followers of Pio and published by the friary where he once lived as a Capuchin monk, and where he is now entombed and celebrated.
Pio was an unusually controversial figure in life—as beleaguered by allegations of impropriety as he was revered for his supposed mystical powers. And as Paul Charles's letter amply demonstrates, he remains a controversial figure in death.
I read with interest the article by Emily Bazelon ("What Would Zimbabwe Do?") in the November Atlantic. I was, however, taken aback by her comment in the last paragraph, where she refers to laws that "limit free speech in Canada." As a citizen of Canada, I am unaware that my free speech is limited, and I wonder whether Bazelon would care to elaborate. I would not like to continue to voice my opinions so openly if I am contravening legal statutes.
Sharon Coulter Nichol
Fairmont Hot Springs, B.C.
Emily Bazelon replies:
Canada's Charter of Rights and Freedoms treats as "fundamental" the rights to free speech and freedom of the press. But the charter makes these rights subject to "such reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society." In other words, Canadian free-speech rights have a built-in check. In some contexts the country's courts have interpreted the charter to allow for more suppression of speech than American law permits. In 1990 Canada's supreme court upheld a law barring hate speech. In 1992 the court adopted a relatively broad definition of obscenity, including material that exploits sex in a "degrading or dehumanizing" manner. And in 2002 a lower court outraged some civil libertarians by finding a man guilty of violating Saskatchewan's Human Rights Code after he placed an advertisement in a local newspaper. The ad was for a bumper sticker. It cited (without quoting) biblical passages that condemn some homosexual acts and showed two male stick figures holding hands, standing in a circle with a slash through it.
The December 2005 issue of The Atlantic Monthly features a review of Yael Hedaya's latest novel, Accidents. I was pleased to find in your publication a very thoughtful review of the work of this overlooked author. However, as the translator of the novel, I was disappointed to see that not only was my name not credited, but there was no mention at all that this novel is a translation. Neglecting to credit the translator, without whom most of your readers would not be able to read the book, is an unfortunate oversight that reflects a general lack of awareness of the art of literary translation. Literary translators invest a great deal of time and creative energy to expose English-speaking readers to works they would not otherwise have access to, and their efforts should be recognized.
Jessica Cohen
The Editors reply:
Our most sincere apologies to Jessica Cohen, who should certainly have been credited in the review. For lovers of literature and knowledge around the globe, she and her colleagues perform an invaluable service in the exercise of their art.
Regarding "Things Left Undone," by Richard A. Clarke (November Atlantic): "Perfect" response to large-scale disasters is as improbable as a baseball team's batting 1.000. And the "misses" are human tragedies. FEMA, as its name implies, is designed to "manage" an emergency, and it cannot be expected to protect every citizen against being hurt.
But can a crisis be prevented? In most cases, yes. The scenarios that unfolded after Katrina's landfall were predictable and preventable, even if the storm itself was not.
First, flooding in New Orleans started well after Katrina had passed and could certainly have been averted. The floodwall design on the 17th Street Canal was an obvious weakness, and the possibility of its failure should at least have been considered. Mitigation would have been as easy as installing relatively inexpensive floodgates at the entrances of each canal into Lake Pontchartrain or the Intracoastal Waterway. Once closed, they would have effectively prevented the backflow of the lake through a canal breach. The levee boards and the Army Corps of Engineers had plenty of money, but not enough foresight or commitment. A fatal mistake!
As for the evacuation, to say that it was botched is an understatement. The death toll would have been beyond imagination if the storm had moved thirty miles farther to the west and left in New Orleans the type of destruction that was found in Mississippi. No effective response to such a scenario is imaginable other than preventing it in the first place by executing a perfect evacuation.
A perfect evacuation would be difficult but not impossible. It would certainly be far easier to achieve than a "perfect response," and far more effective than plucking hapless stragglers from rooftops with helicopters. Plans would have to be detailed down to the level of the individual citizen. Every citizen could be assigned a time to leave and a place to go after the evacuation is called. Rolling police barricades, emergency programming for traffic lights, and color-coded window stickers for cars could aid in traffic control. Before the start of hurricane season every citizen could be issued a "hurricane wristband" with bar-code information for identification and emergency-aid authorization. Activated by ZIP code when the evacuation is ordered, it could revolutionize the supply of goods to evacuees by outsourcing emergency assistance to selected partners like Wal-Mart and Target. Fraud, waste, and hassle would be eliminated. And nobody would be lost, since the bar-code readers would track people's movements.
Can a city the size of New Orleans afford to go through an evacuation process like this each time it comes within the seventy-two-hour landfall probability cone of a hurricane? It cannot afford not to.