Factor Analysis

A newly pervasive linguistic trope raises a question: Have we stumbled on the basic unit of social causality?

POLITICAL commentators assessing the prospects for Al Gore in his race for the presidency and Hillary Rodham Clinton in her presumptive race for the United States Senate have in recent months drawn attention to a phenomenon that could dampen public enthusiasm for both candidates: the "Clinton-fatigue factor." The columnist Bob Herbert, writing in The New York Times about Hillary Clinton's campaign, defined the Clinton-fatigue factor succinctly: "A lot of voters have had their fill of Bill and Hillary and would like to move on." The Clinton-fatigue factor may be a palpable reality, but it is by no means the only factor that has been enjoying prominence in the news. Last spring Steven Spielberg's relatively modest showing at the Academy Awards -- his Saving Private Ryan received eleven nominations but won only five Oscars -- was attributed by an industry observer to a combination of the "Him again? factor" and the "What more does he need? factor." In Hollywood, according to one publication, the essential component of star quality is the "'it' factor"; modeling agencies, the same source says, are looking for a very different component -- the "X-factor." The movie The Mummy, a Los Angeles Times critic decided, was given momentum by the "squirm factor." Armageddon succeeded because of the "testosterone factor."

Economists, I notice, have been calling attention to a "feel-good factor" that has bolstered consumer confidence and prepared the way for a "drool factor," lubricating retail sales. The "spinach factor" refers to a potentially off-putting amount of actual substance in something intended for mass-market entertainment; of course, the "dreck factor" can also prove off-putting. People will often sit and stare anyway, media analysts observe, owing to the "If it's on, watch it factor."

Bill Clinton figures again in the "It could be me factor" -- cited by a writer in Time to explain the public's reluctance to judge the President's personal behavior harshly. The Clinton-fatigue factor aside, Al Gore is said by the columnist David Broder to have benefited from the "G factor" -- that is, becoming a grandfather -- even as George W. Bush draws strength from the "inevitability factor." An aversion to public discussion of distasteful subjects is brought on by the "yuck factor." The attraction of journalists to predictions of doom originates with the "Chicken Little factor." Having survived the yuck factor and the Chicken Little factor, the Clinton Administration has been advised by political operatives to beware of the "gloat factor."

Pick a phenomenon and some sort of factor will almost always be found at the bottom of it. Hundreds have been isolated and identified in recent years. It used to be that explanations for complex behavioral outcomes were amorphous and murky -- insights to be approached by means of the hunch, the gut feeling, the surmise. These crude tools are now things of the past, replaced by new instruments of refinement and power.

* * *

All this needs to be seen against a larger backdrop. In his book The End of Science (1996) the writer John Horgan makes the audacious claim that human understanding has closed in on all the big verifiable truths about the universe and how it works: the Big Bang, the theory of relativity, the forces governing matter, the fact of evolution, the mechanism of genetics. The sad truth, Horgan argues, is that "the great era of scientific discovery is over." Although a number of significant questions do remain unanswered (What is consciousness? Why did life begin?), the answers are necessarily speculative and therefore beyond the proper realm of science.


For purely scientific questions the task ahead has been reduced to filling in the details. Horgan states, "Further research may yield no more great revelations or revolutions, but only incremental, diminishing returns."

I don't know if Horgan is right about the natural sciences; his book has aroused a certain amount of derision. But the situation he describes stands in contrast to the one prevailing in the social sciences. There the great era of scientific discovery has scarcely begun. Specific insights have been piling up for a century or more, unaccompanied by the emergence of an overarching theory.

The insights have occasionally been stunning, to be sure. In Norway psychologists recently demonstrated that people afflicted with a "negative mood" tend to be more adept at creative problem-solving than people blessed with a happier temperament -- research that perhaps explains the technological superiority of the world's wet and chilly regions. Another example: last spring behavioral scientists at the University of Illinois College of Medicine and the Smell and Taste Treatment and Research Foundation, in Chicago, reported a correlation between the act of lying and an itch-producing engorgement of erectile tissue inside the nose -- a correlation, they say, that may support "using frequency of proboscis manipulation as an indicator of mendacity."

For the most part, though, reports of new social-science findings suggest a tentative groping toward the mundane. A study released not long ago by the New York-based research firm Public Agenda found that most adults were skeptical that the nation's teenagers would ever mature into responsible Americans. Psychologists at the National Institute for Healthcare Research have publicized an investigation of couples showing that when one member has indulged in hurtful behavior, the subsequent apology-forgiveness dynamic plays out most successfully if the couple's relationship is "close, committed, and satisfactory." Newspaper headlines, summarizing the latest research, offer minimalist validations of what one never had much cause to doubt: "INFIDELITY COMMON AMONG BIRDS AND MAMMALS, EXPERTS SAY." "STUDY SUGGESTS VALUES OF HOLLYWOOD'S ELITE MAY NOT REFLECT PUBLIC'S." "DEFINING JUST HOW OLD IS OLD VARIES WITH AGE OF THOSE ASKED." "ANXIOUS MEN RISK HYPERTENSION." "NEW YORK FIRMS CALLED PRICEY, ARROGANT."

These may be solid little bricks in the edifice of understanding, but the structure they define has not grown very imposing.

* * *

Until recently. As noted, the accelerating interest in factors holds out the promise of something larger. In some guise the idea of the factor has, of course, been around for decades, in phrases like "human factor" and "risk factor" and "deciding factor." Now it seems to be emerging as the basic unit of social and behavioral causality. Is it premature to hope that factors will do for the social sciences what genes have done for the study of heredity and "memes" are doing for the study of cultural evolution? Tellingly, the gene itself, when first postulated in the early years of this century, was given the name "factor."

The task of mapping the human genome, under the auspices of the Human Genome Project and other groups, is nearly complete. A Social Factor Project, comparable in scope, would be a worthy successor to that enterprise. I have run this proposal by the biologists Stephen Jay Gould and E. O. Wilson, the physicists Murray Gell-Mann and Stephen Hawking, and the senator and social thinker Daniel Patrick Moynihan, and have received not the slightest sign of discouragement (or, indeed, any personal response). Meanwhile, I have begun parceling out chunks of the work to interested parties. The results, gleaned from ordinary newspapers and magazines, are coming in faster than expected. One day brings discovery of a "feng-shui factor," the next of a "kilt-movie factor." Those feelings of annoyance at the sight of someone talking on a cell phone in public? According to a recent news report, telephone-company executives have pinpointed the culprit: the "yuppie-scum factor."

Critics will charge that factor analysis is overly reductionist. Some will fret about disturbing uses to which our new knowledge may be put. Ideological disputes are likely. All these are a normal part of any discipline's development, and no cause for alarm. The next step is federal funding. An e-mail from Senator Moynihan's office said that my views were welcome. Whether the proboscis-manipulation factor was at work could not be determined.


Cullen Murphy is the author of The Word According to Eve (1998).


Illustration by Seth.

The Atlantic Monthly; November 1999; Volume 284, No. 5; pages 16-18.



Cullen Murphy

Says Cullen Murphy, "At The Atlantic we try to provide a considered look at all aspects of our national life; to write, as well, about matters that are not strictly American; to emphasize the big story that lurks, untold, behind the smaller ones that do get told; and to share the conclusions of our writers with people who count."

Murphy served as The Atlantic Monthly's managing editor from 1985 until 2005, when the magazine relocated to Washington. He has written frequently for the magazine on a great variety of subjects, from religion to language to social science to such out-of-the-way matters as ventriloquism and his mother's method for pre-packaging lunches for her seven school-aged children.

Murphy's book Rubbish! (1992), which he co-authored with William Rathje, grew out of an article that was written by Rathje, edited by Murphy, and published in the December, 1989, issue of The Atlantic Monthly. In a feature about the book's success, The New York Times reported that the article "was nominated for a National Magazine Award in 1990 and became a runaway hit for The Atlantic Monthly, which eventually ran off 150,000 copies of it." Murphy's second book, Just Curious, a collection of his essays that first appeared in The Atlantic Monthly and Harper's, was published in 1995. His most recent book, The Word According to Eve: Women and the Bible in Ancient Times and Our Own, was published in 1998 by Houghton Mifflin. The book grew out of Murphy's August 1993 Atlantic cover story, "Women and the Bible."

Murphy was born in New Rochelle, New York, and grew up in Greenwich, Connecticut. He was educated at Catholic schools in Greenwich and in Dublin, Ireland, and at Amherst College, from which he graduated with honors in medieval history in 1974. Murphy's first magazine job was in the paste-up department of Change, a magazine devoted to higher education. He became an editor of The Wilson Quarterly in 1977. Since the mid-1970s Murphy has written the comic strip Prince Valiant, which appears in some 350 newspapers around the world.
