The Utmost Measures

A word in behalf of subjectivity

About 75 million Americans have taken the SAT for college admission since the Scholastic Aptitude Test (as it was then known) was first administered, in 1926, to 8,040 high school seniors. Last summer the College Board, which oversees the test, announced plans for significant revisions, scheduled to go into effect in the spring of 2005. The biggest change is the addition of an extemporaneous essay. Together with some grammar questions, the essay will constitute a third part of the test, joining the revised math and verbal sections. Students will be given an essay topic; some possibilities mentioned by the College Board include "Nothing requires more discipline than freedom," "The greatest griefs are those we cause ourselves," and "Novelty is too often mistaken for progress."

Educators will argue for years over the wisdom of the specific changes, and over what sorts of students stand to benefit and what sorts stand to lose. But two aspects of the new SAT are welcome developments on their face.

The first is that all those who have ever bragged about their SAT scores will suddenly see their claims undermined. Because there will soon be three sections, worth 800 points apiece, the best possible scores will total 2400; the former gold standard of 1600 will appear deeply mediocre by comparison. Supplying an explanatory footnote—as history books do, say, to translate the value of ancient drachmas into modern dollars—will be awkward at best. The entire Baby Boom generation will be in the position of having to make the case to its grandchildren that back in the sixties or seventies an SAT score of 1490 or 1540 was actually not bad—like grandparents today who must patiently explain, to vaguely disbelieving young people, that "five thousand dollars was quite a lot of money in those days."

The second welcome development is the element of subjective judgment, which both the writing and the grading of the essays will require. Subjective judgment has been maligned for decades, on grounds of capriciousness and unfairness, and because it is "unscientific." Standardized testing was, of course, an attempt to get away from subjectivity. But every advance in testing, it seems, elicits the discovery of further flaws. This is the paradox of measurement: the more objective and precise we get, the more nimbly truth manages to keep a certain distance.

From the archives:

"If the GDP is Up, Why is America Down?" (October 1995)
Why we need new measures of progress, why we do not have them, and how they would change the social and political landscape. By Clifford Cobb, Ted Halstead, and Jonathan Rowe

Janet L. Norwood, who served as the nation's Commissioner of Labor Statistics for more than a decade, once complained, "The real problem is that people often want a number to tell them everything." She was referring to disagreements over the validity of seemingly objective economic measures—the unemployment rate and the poverty rate, the Consumer Price Index and the gross domestic product—but she might as well have been referring to an immutable trait of human character.

By chapter six in the Book of Genesis, three chapters after the expulsion from Eden, human beings must confront the importance of being able to measure things. The Lord commanded Noah to build an ark: "This is how you are to make it: the length of the ark three hundred cubits, its breadth fifty cubits, and its height thirty cubits." To which Noah replied, as recorded on Bill Cosby's first album, "Right. What's a cubit?" (It is the distance from the elbow to the tip of the middle finger.) Ever since, measurement has been extended to more and more phenomena—the mass of an electron, the size of the universe—and has become more and more refined.

From the archives:

"Who Owns Intelligence?" (February 1999)
Three unresolved issues will dominate the discussion of intelligence: whether intelligence is one thing or many things; whether intelligence is inherited; and whether any of its elements can accurately be measured. By Howard Gardner

The number of standardized ways in which a typical person is measured, beginning now with prenatal screening and continuing up through such things as "emotional intelligence" assays and "360-degree" performance reviews, must run into the dozens. Measurements of electrical patterns in the amygdala, a region of the forebrain, may soon reveal our innermost emotions. An automobile designed by Toyota in cooperation with Sony, called the Pod and intended to help control road rage, contains sensors to measure pulse rate and level of perspiration; at the first sign of trouble it begins to play soothing music and warns drivers to calm down. (Memo to Toyota: this will only make them madder.) Measurements are palpated for the subtlest insights. A recent study conducted by clinicians in Toronto explored the relationship between high status and good health by comparing the longevity of Oscar winners with that of other actors and actresses. (Oscar winners live, on average, 3.9 years longer.) In another recent study researchers at the University of California at Berkeley randomly selected 720 people at street festivals in San Francisco, asked them about their sexual orientation, and measured their fingers. Lesbians, the researchers concluded, are more likely to have index fingers that are unusually short relative to their ring fingers.

Everyone knows that many types of measurement are at best crude constructs, especially when it comes to human psychology and well-being. But even the brute physical world remains mysteriously elusive. Fingerprints have been a bedrock of forensic evidence for decades—but the reliability of fingerprint analysis in some circumstances has lately been called into question. If anything ought to stay still long enough to be precisely measured, it is nature's physical "constants." But it turns out that our values for such things as the gravitational constant, the fine-structure constant, and even the speed of light may not be as solid as one would wish. "The constants of nature could be lawless variables," a physicist from London's Imperial College told a conference earlier this year.

In the everyday world, too, standard methods of measurement have been found to fall short. Last year the National Weather Service announced that the formula for the wind-chill factor had been somewhat inaccurate ever since it was adopted, in 1973, and that a new formula would be used in the future. Many meteorologists agree that the heat index, the wind-chill factor's warm-weather counterpart, could also benefit from remedial attention.
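For the curious, the revised formula is compact enough to state outright. A minimal sketch in Python of the wind-chill calculation the National Weather Service adopted in 2001 (the function name is mine; inputs are temperature in degrees Fahrenheit and wind speed in miles per hour, valid roughly for temperatures at or below 50°F and winds above 3 mph):

```python
def wind_chill(temp_f: float, wind_mph: float) -> float:
    """Wind-chill temperature per the formula the NWS adopted in 2001,
    replacing the 1973 version the article mentions."""
    v = wind_mph ** 0.16  # wind speed raised to an empirical exponent
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

# At 0°F with a 15 mph wind, the air feels like roughly -19°F.
```

The old formula, derived from Antarctic freezing-rate experiments, tended to overstate the chill; the revised one is based on modern models of heat loss from exposed skin.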

No one would argue for scrapping society's vast accumulated infrastructure of measurement. Indeed, there may be some new statistical indices we'd all be grateful to have. For instance, it would be easy enough to devise an accuracy-of-prognostication index for newspaper columnists, perhaps a little number that would appear right after the byline: "by Robert Novak (1.7)"; "by David S. Broder (7.6)." But at the same time, it might be worth giving subjective judgment more weight. Subjective judgment, after all, is what gives us epigrams. It is the methodology that informs such phrases as "gut reaction" and "cut of his jib." It is why NASA still uses noses rather than machines to decide which smells will prove intolerable in space. It often captures truth more fully than any measurement can.

As the College Board has shown, objective measures can easily be supplemented with something more individual and illuminating—namely, a short essay.

Imagine, say, that Albert Camus were a television meteorologist. After reporting the heat index, and maybe comically mopping his brow, he would tell us what being outside in such heat actually felt like:

I was surprised at how fast the sun was climbing in the sky. I noticed that for quite some time the countryside had been buzzing with the sound of insects and the crackling of grass. The sweat was pouring down my face ... The glare from the sky was unbearable. At one point, we went over a section of the road that had just been repaved. The tar had burst open in the sun. Our feet sank into it, leaving its shiny pulp exposed. [The Stranger]

Or the meteorologist could be Jack London. After he finished telling us the wind-chill factor, with an affable on-camera shudder and a sideways grin at the news anchor, he might also explain what that terrible degree of cold really did to a person:

It was surprising, the rapidity with which his cheeks and nose were freezing. And he had not thought his fingers could go lifeless in so short a time. Lifeless they were, for he could scarcely make them move together to grip a twig, and they seemed remote from his body and from him. When he touched a twig, he had to look and see whether or not he had hold of it. ["To Build a Fire"]

This approach could usefully augment many types of measured assessment—the latest unemployment figures, an electrocardiogram, a new estimate of the age of the universe. I put it forward with a certain hesitation, aware that novelty is too often mistaken for progress.

Cullen Murphy is The Atlantic's managing editor.