Never Mind

Old science doesn't die ...

By Cullen Murphy

For nearly thirty years Cambridge University's Stephen W. Hawking has been the cosmologist most closely associated in the public mind with the phenomenon of black holes—cosmic concentrations of mass so dense that nothing can escape from them. The basic idea behind black holes has a long pedigree. The English geologist John Michell speculated in 1783 that a celestial body with a radius 500 times that of the sun, but with the same density, would possess an escape velocity at its surface equal to the speed of light, meaning that light could not escape: "all light emitted from such a body would be made to return towards it, by its own proper gravity." Theorizing about black holes picked up considerable momentum after Einstein, and during the past several decades in particular has been the focus of much speculation. It was Hawking who established key theorems about the "event horizon" (the outer boundary of a black hole), and it was Hawking who in 1976 proposed that black holes were essentially omnivorous, swallowing not only matter but also information (although it was possible, he thought, that the information might escape into new "baby universes" forming inside the black hole). Hawking's ideas about information loss contradicted a basic principle of quantum mechanics, which holds that information can never truly be destroyed. They led to deep divisions among physicists, and inspired a vast amount of science fiction.
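(A quick back-of-the-envelope check of Michell's figure, using rounded modern values and Newtonian reasoning rather than anything in his original letter: the escape velocity from the surface of a sphere of mass $M$ and radius $R$ is

    $v_{\text{esc}} = \sqrt{2GM/R}$,

and because a body of fixed density $\rho$ has $M = \tfrac{4}{3}\pi\rho R^{3}$, the escape velocity grows in direct proportion to the radius: $v_{\text{esc}} = \sqrt{\tfrac{8\pi G\rho}{3}}\,R$. The sun's escape velocity is about 618 kilometers per second, so a body of solar density but 500 times the solar radius would have an escape velocity of roughly $500 \times 618 \approx 309{,}000$ kilometers per second, just over the speed of light.)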

Now, after years of fuss, Hawking has conceded error. At the last minute he contacted the organizers of the 17th International Conference on General Relativity and Gravitation, which was about to be held in Dublin. Hawking's message: "I have solved the black hole information paradox and I want to talk about it." When his turn came to speak at the conference, he had this to say: "I am sorry to disappoint science-fiction fans, but if you jump into a black hole, your mass energy will be returned to our universe, but in mangled form. There is no baby universe branching off, as I once thought." Emily Litella, a Gilda Radner character on Saturday Night Live, used to deliver editorial rants against "free Soviet jewelry" and "making Puerto Rico a steak," only to correct herself meekly when informed of her error by a colleague. Stephen Hawking's comments in Dublin amounted to a cosmological "Never mind."

It is always a little disconcerting when audacious scientific theories come a cropper. Sometimes what is lost is not just a specific explanation but a whole way of thinking, an entire world view. Perhaps for that reason, old scientific theories do not wholly die. Oscar Wilde once observed that "science is the record of dead religions." He might have added, in a codicil, that "metaphor is the record of dead science."

The theory of black holes hasn't been discredited in its entirety, just one of its more intriguing postulates. But even if the theory itself were sucked into a black hole, it's hard to believe that the black-hole metaphor—for the bedrooms of certain children, the minds of certain friends, the legal status of certain detainees—wouldn't be around more or less forever. Here's The Boston Globe commenting on memories of the 1960s at Democratic conventions: "They loom stage left, a blur to many of us, a black hole to the rest." Here's the Omaha World Herald on the fortunes of a neighboring state: "While Iowa has been the center of the whirlwind at caucus time, by November the state usually has been sucked into a political black hole."

Theories rejected eons ago as inadequate for the narrow purposes of science have proved far too useful to reject in the broader world of normal life. The idea that a Great Flood once destroyed most life on the planet is now untenable, but we still think of antiquated people and ideas as being antediluvian, "before the deluge." I am aware, and accept, that ships sailing off to the horizon will not actually reach a place where the world ends and the oceans spill off into the void. But regardless of what the geographers say, "falling off the face of the earth" is something that happens—to fashions, to celebrities, to popular culture. It is the explanation for any number of phenomena: What happened to the huge surplus that Gore and Bush sparred over? Where are Erik Estrada and Menudo, Vanilla Ice and Andrew Dice Clay?

I know that technically the conception of Earth as lying at the center of the universe, as proposed by the Alexandrian astronomer Ptolemy, is at odds with the established facts. But inadequate as it may be in cosmology, the Ptolemaic metaphor is relevant almost everywhere else. A natural-history exhibit was taken to task a few years ago for "a decidedly Ptolemaic view: the world revolves around New Jersey." Recently I saw an article about people whose lives revolve around their children—"Ptolemaic parenting," this was called. (I'm a committed Copernican myself.)

Reproductive specialists these days are understandably skeptical of the early-modern notion that human beings grow to full size from a homunculus, a fully formed but hairless miniature person inhabiting sperm cells. But homunculus the metaphor continues to propagate. Usually it refers to some small, original version of a much larger thing (Wisconsin's welfare program was said to be the homunculus version of the federal Welfare Reform Act of 1996; Iceland's ancient parliament, the Althing, is the homunculus version of Congress), but it can also mean someone who resembles the homunculi depicted in illuminated treatises. I have seen the Oscar statuette referred to as a "gilded homunculus," and Aristotle Onassis as a "leathery homunculus and shipping tycoon." Gollum, Mini-Me, James Carville—in their vastly different ways they all promote homuncular vitality.

Obsolete science survives as metaphor in "philosopher's stone" (the substance alchemists sought in order to turn base metals into gold) and "spontaneous generation" (the idea that life springs into existence out of nothing, or that things can happen without a cause). Both these concepts come up a lot in, for instance, discussions of economic policy. Phlogiston, the hypothetical element once thought to account for fire, gets pressed into service as a stand-in for any mythical causative substance. (Graham Greene regarded cholesterol as akin to phlogiston.) The Greek physician Galen believed that four "humours"—phlegm, black bile, yellow bile, and blood—accounted for all bodily functions and human behavior, an antiquated conceit that no reputable scientist would now endorse. But one still hears references to "the humours," and personally I think that a mere four elements—solipsism, debt, litigation, and hype—could easily explain about 90 percent of human activity.

Psychoanalysis is a nearly boundless category unto itself. It has not yet succumbed totally to the inroads of Prozac and Paxil, but even if drugs come to dominate the clinical future, the concepts of id, ego, superego, Oedipus complex, repression, and the rest are unlikely to atrophy in normal discourse. Superego is conscience, morality, authority—Consumer Reports, the Boy Scout Handbook, Pope John Paul II. The id is the unconscious, the source of instinctual, sometimes shameful, impulses. The actor Jim Carrey was once described as "an insatiable, rampaging id," what's left over "once the layers of civilization have been peeled away." Philip Roth's Portnoy's Complaint has been called "an emblem of the national id." The Super Bowl halftime show, Fear Factor, talk radio—all these things are manifestations of id. An automobile expert had this to say in The Washington Post about how SUV manufacturers balance consumer desires and safety: "There is this fine line that they walk between id and superego."

"Relativity"? "Big Bang"? "Quantum leap"? "Natural selection"? "The uncertainty principle"? These terms are associated with ideas that are very much alive and in good theoretical standing, but who doubts that we'd still employ them metaphorically even if the underlying scientific concepts were shelved? All I know about chaos theory is what Jeff Goldblum explained in Jurassic Park; as actual science the whole concept may be overblown. But as an expression of ordinary social dynamics—the Enron scandal, the Kennedy family, the reconstruction of Iraq—there is clearly something to it.

A few weeks after Hawking's announcement Paul Ginsparg, a professor of physics at Cornell, published a newspaper commentary suggesting that the back-pedaling may have been premature—that Hawking's ideas might not in fact be antediluvian, and could still turn out to be right. As Emily Litella might have observed, the situation is now in a great steak of uncertainty.
