
November 1999

Notes & Comment: Factor Analysis

A newly pervasive linguistic trope raises a question: Have we stumbled on the basic unit of social causality?

by Cullen Murphy

POLITICAL commentators assessing the prospects for Al Gore in his race for the presidency and Hillary Rodham Clinton in her presumptive race for the United States Senate have in recent months drawn attention to a phenomenon that could dampen public enthusiasm for both candidates: "Clinton-fatigue factor." The columnist Bob Herbert, writing in The New York Times about Hillary Clinton's campaign, defined Clinton-fatigue factor succinctly: "A lot of voters have had their fill of Bill and Hillary and would like to move on." Clinton-fatigue factor may be a palpable reality, but it is by no means the only factor that has been enjoying prominence in the news. Last spring Steven Spielberg's relatively modest showing at the Academy Awards -- his Saving Private Ryan received eleven nominations but won only five Oscars -- was attributed by an industry observer to a combination of the "Him again? factor" and the "What more does he need? factor." In Hollywood, according to one publication, the essential component of star quality is the "'it' factor"; modeling agencies, the same source says, are looking for a very different component -- the "X-factor." The movie The Mummy, a Los Angeles Times critic decided, was given momentum by the "squirm factor." Armageddon succeeded because of the "testosterone factor."
Economists, I notice, have been calling attention to a "feel-good factor" that has bolstered consumer confidence and prepared the way for a "drool factor," lubricating retail sales. The "spinach factor" refers to a potentially off-putting amount of actual substance in something intended for mass-market entertainment; of course, the "dreck factor" can also prove off-putting. People will often sit and stare anyway, media analysts observe, owing to the "If it's on, watch it factor."

Bill Clinton figures again in the "It could be me factor" -- cited by a writer in Time to explain the public's reluctance to judge the President's personal behavior harshly. Clinton-fatigue factor aside, Al Gore is said by the columnist David Broder to have benefited from the "G factor" -- that is, becoming a grandfather -- even as George W. Bush draws strength from the "inevitability factor." An aversion to public discussion of distasteful subjects is brought on by the "yuck factor." The attraction of journalists to predictions of doom originates with the "Chicken Little factor." Having survived the yuck factor and the Chicken Little factor, the Clinton Administration has been advised by political operatives to beware of the "gloat factor."

Pick a phenomenon and some sort of factor will almost always be found at the bottom of it. Hundreds have been isolated and identified in recent years. It used to be that explanations for complex behavioral outcomes were amorphous and murky -- insights to be approached by means of the hunch, the gut feeling, the surmise. These crude tools are now things of the past, replaced by new instruments of refinement and power.

* * *

All this needs to be seen against a larger backdrop. In his book The End of Science the writer John Horgan makes the audacious claim that human understanding has closed in on all the big verifiable truths about the universe and how it works: the Big Bang, the theory of relativity, the forces governing matter, the fact of evolution, the mechanism of genetics. The sad truth, Horgan argues, is that "the great era of scientific discovery is over." Although a number of significant questions do remain unanswered (What is consciousness? Why did life begin?), the answers are necessarily speculative and therefore beyond the proper realm of science.

For purely scientific questions the task ahead has been reduced to filling in the details. Horgan states, "Further research may yield no more great revelations or revolutions, but only incremental, diminishing returns."

I don't know if Horgan is right about the natural sciences; his book has aroused a certain amount of derision. But the situation he describes stands in contrast to the one prevailing in the social sciences. There the great era of scientific discovery has scarcely begun. Specific insights have been piling up for a century or more, unaccompanied by the emergence of an overarching theory.

The insights have occasionally been stunning, to be sure. In Norway psychologists recently demonstrated that people afflicted with a "negative mood" tend to be more adept at creative problem-solving than people blessed with a happier temperament -- research that perhaps explains the technological superiority of the world's wet and chilly regions. Another example: last spring behavioral scientists at the University of Illinois College of Medicine and the Smell and Taste Treatment and Research Foundation, in Chicago, reported a correlation between the act of lying and an itch-producing engorgement of erectile tissue inside the nose -- a correlation, they say, that may support "using frequency of proboscis manipulation as an indicator of mendacity."

For the most part, though, reports of new social-science findings suggest a tentative groping toward the mundane. A study released not long ago by the New York-based research firm Public Agenda found that most adults were skeptical that the nation's teenagers would ever mature into responsible Americans. Psychologists at the National Institute for Healthcare Research have publicized an investigation of couples showing that when one member has indulged in hurtful behavior, the subsequent apology-forgiveness dynamic plays out most successfully if the couple's relationship is "close, committed, and satisfactory." Newspaper headlines, summarizing the latest research, offer minimalist validations of what one never had much cause to doubt: "INFIDELITY COMMON AMONG BIRDS AND MAMMALS, EXPERTS SAY." "STUDY SUGGESTS VALUES OF HOLLYWOOD'S ELITE MAY NOT REFLECT PUBLIC'S." "DEFINING JUST HOW OLD IS OLD VARIES WITH AGE OF THOSE ASKED." "ANXIOUS MEN RISK HYPERTENSION." "NEW YORK FIRMS CALLED PRICEY, ARROGANT."

These may be solid little bricks in the edifice of understanding, but the structure they define has not grown very imposing.

* * *

Until recently. As noted, the accelerating interest in factors holds out the promise of something larger. In some guise the idea of the factor has, of course, been around for decades, in phrases like "human factor" and "risk factor" and "deciding factor." Now it seems to be emerging as the basic unit of social and behavioral causality. Is it premature to hope that factors will do for the social sciences what genes have done for the study of heredity and "memes" are doing for the study of cultural evolution? Tellingly, the gene itself, when first postulated in the early years of this century, was given the name "factor."

The task of mapping the human genome, under the auspices of the Human Genome Project and other groups, is nearly complete. A Social Factor Project, comparable in scope, would be a worthy successor to that enterprise. I have run this proposal by the biologists Stephen Jay Gould and E. O. Wilson, the physicists Murray Gell-Mann and Stephen Hawking, and the senator and social thinker Daniel Patrick Moynihan, and have received not the slightest sign of discouragement (or, indeed, any personal response). Meanwhile, I have begun parceling out chunks of the work to interested parties. The results, gleaned from ordinary newspapers and magazines, are coming in faster than expected. One day brings discovery of a "feng-shui factor," the next of a "kilt-movie factor." Those feelings of annoyance at the sight of someone talking on a cell phone in public? According to a recent news report, telephone-company executives have pinpointed the culprit: the "yuppie-scum factor."

Critics will charge that factor analysis is overly reductionist. Some will fret about disturbing uses to which our new knowledge may be put. Ideological disputes are likely. All these are a normal part of any discipline's development, and no cause for alarm. The next step is federal funding. An e-mail from Senator Moynihan's office said that my views were welcome. Whether the proboscis-manipulation factor was at work could not be determined.


Cullen Murphy is the author of The Word According to Eve: Women and the Bible in Ancient Times and Our Own (1998).

Illustration by Seth.

Copyright © 1999 by The Atlantic Monthly Company. All rights reserved.
The Atlantic Monthly; November 1999; Factor Analysis; Volume 284, No. 5; pages 16-18.