Innocent Bystander
October 2004

Never Mind

Old science doesn't die ...

For nearly thirty years Cambridge University's Stephen W. Hawking has been the cosmologist most closely associated in the public mind with the phenomenon of black holes—cosmic concentrations of mass so dense that nothing can escape from them. The basic idea behind black holes has a long pedigree. The English geologist John Michell speculated in the eighteenth century that a celestial body with a radius 500 times greater than that of the sun, but with the same density, would possess an escape velocity at its surface equal to the speed of light, meaning that light could not escape: "all light emitted from such a body would be made to return towards it, by its own proper gravity." Theorizing about black holes picked up considerable momentum after Einstein, and has intensified during the past several decades in particular. It was Hawking who helped work out the physics of the "event horizon" (the outer boundary of a black hole), and it was Hawking who in 1976 proposed that black holes were essentially omnivorous, swallowing not only matter but also information (although it was possible, he thought, that the information might escape into new "baby universes" forming inside the black hole). Hawking's ideas about information loss contradicted a basic tenet of quantum mechanics, which holds that information can never be destroyed. They led to deep divisions among scientists, and to a vast amount of science fiction.

Now, after years of fuss, Hawking has conceded error. At the last minute he contacted the organizers of the 17th International Conference on General Relativity and Gravitation, to be held in Dublin. Hawking's message: "I have solved the black hole information paradox and I want to talk about it." When his turn came to speak at the conference, he had this to say: "I am sorry to disappoint science-fiction fans, but if you jump into a black hole, your mass energy will be returned to our universe, but in mangled form. There is no baby universe branching off, as I once thought." Emily Litella, a Gilda Radner character on Saturday Night Live, used to deliver editorial rants against "free Soviet jewelry" and "making Puerto Rico a steak," only to correct herself meekly when informed of her error by a colleague. Stephen Hawking's comments in Dublin amounted to a cosmological "Never mind."

It is always a little disconcerting when audacious scientific theories come a cropper. Sometimes what is lost is not just a specific explanation but a whole way of thinking, an entire world view. Perhaps for that reason, old scientific theories do not wholly die. Oscar Wilde once observed that "science is the record of dead religions." He might have added, in a codicil, that "metaphor is the record of dead science."

The theory of black holes hasn't been discredited in its entirety, just one of its more intriguing postulates. But even if the theory itself were sucked into a black hole, it's hard to believe that the black-hole metaphor—for the bedrooms of certain children, the minds of certain friends, the legal status of certain detainees—wouldn't be around more or less forever. Here's The Boston Globe commenting on memories of the 1960s at Democratic conventions: "They loom stage left, a blur to many of us, a black hole to the rest." Here's the Omaha World Herald on the fortunes of a neighboring state: "While Iowa has been the center of the whirlwind at caucus time, by November the state usually has been sucked into a political black hole."

Theories rejected eons ago as inadequate for the narrow purposes of science have proved far too useful to reject in the broader world of normal life. The idea that a Great Flood once destroyed most life on the planet is now untenable, but we still think of antiquated people and ideas as being antediluvian, "before the deluge." I am aware, and accept, that ships sailing off to the horizon will not actually reach a place where the world ends and the oceans spill off into the void. But regardless of what the geographers say, "falling off the face of the earth" is something that happens—to fashions, to celebrities, to popular culture. It is the explanation for any number of phenomena: What happened to the huge surplus that Gore and Bush sparred over? Where are Erik Estrada and Menudo, Vanilla Ice and Andrew Dice Clay?

I know that technically the conception of Earth as lying at the center of the universe, as codified by the Alexandrian astronomer Ptolemy, is at odds with the established facts. But inadequate as it may be in cosmology, the Ptolemaic metaphor is relevant almost everywhere else. A natural-history exhibit was taken to task a few years ago for "a decidedly Ptolemaic view: the world revolves around New Jersey." Recently I saw an article about people whose lives revolve around their children—"Ptolemaic parenting," this was called. (I'm a committed Copernican myself.)

Reproductive specialists these days are understandably skeptical of the early-modern notion that human beings grow to full size from a homunculus, a fully formed but hairless miniature person inhabiting sperm cells. But as a metaphor the homunculus continues to propagate. Usually it refers to some small, original version of a much larger thing (Wisconsin's welfare program was said to be the homunculus version of the federal Welfare Reform Act of 1996; Iceland's ancient parliament, the Althing, is the homunculus version of Congress), but it can also mean someone who resembles the homunculi depicted in illuminated treatises. I have seen the Oscar statuette referred to as a "gilded homunculus," and Aristotle Onassis as a "leathery homunculus and shipping tycoon." Gollum, Mini-Me, James Carville—in their vastly different ways they all promote homuncular vitality.

Obsolete science survives as metaphor in "philosopher's stone" (the substance alchemists sought in order to turn base metals into gold) and "spontaneous generation" (the idea that life springs into existence out of nothing, or that things can happen without a cause). Both these concepts come up a lot in, for instance, discussions of economic policy. Phlogiston, the hypothetical element once thought to account for fire, gets pressed into service as a stand-in for any mythical causative substance. (Graham Greene regarded cholesterol as akin to phlogiston.) The Greek physician Galen believed that four "humours"—phlegm, black bile, yellow bile, and blood—accounted for all bodily functions and human behavior, an antiquated conceit that no reputable scientist would now endorse. But one still hears references to "the humours," and personally I think that a mere four elements—solipsism, debt, litigation, and hype—could easily explain about 90 percent of human activity.

Psychoanalysis is a nearly boundless category unto itself. It has not yet succumbed totally to the inroads of Prozac and Paxil, but even if drugs come to dominate the clinical future, the concepts of id, ego, superego, Oedipus complex, repression, and the rest are unlikely to atrophy in normal discourse. Superego is conscience, morality, authority—Consumer Reports, the Boy Scout Handbook, Pope John Paul II. The id is the unconscious, the source of instinctual, sometimes shameful, impulses. The actor Jim Carrey was once described as "an insatiable, rampaging id," what's left over "once the layers of civilization have been peeled away." Philip Roth's Portnoy's Complaint has been called "an emblem of the national id." The Super Bowl halftime show, Fear Factor, talk radio—all these things are manifestations of id. An automobile expert had this to say in The Washington Post about how SUV manufacturers balance consumer desires and safety: "There is this fine line that they walk between id and superego."

"Relativity"? "Big Bang"? "Quantum leap"? "Natural selection"? "The uncertainty principle"? These terms are associated with ideas that are very much alive and in good theoretical standing, but who doubts that we'd still employ them metaphorically even if the underlying scientific concepts were shelved? All I know about chaos theory is what Jeff Goldblum explained in Jurassic Park; as actual science the whole concept may be overblown. But as an expression of ordinary social dynamics—the Enron scandal, the Kennedy family, the reconstruction of Iraq—there is clearly something to it.

A few weeks after Hawking's announcement Paul Ginsparg, a professor of physics at Cornell, published a newspaper commentary suggesting that the back-pedaling may have been premature—that Hawking's ideas might not in fact be antediluvian, and could still turn out to be right. As Emily Litella might have observed, the situation is now in a great steak of uncertainty.

Cullen Murphy is The Atlantic's managing editor.
