Graphics by Nigel Holmes
The Millennial Edition of Historical Statistics of the United States was published last year—the first significant revision to our national numeric archive since the Bicentennial Edition a generation ago.
Creating this new edition was more than a matter of adding births, personal-computer sales, cable-TV packages, and cell-phone subscribers, while subtracting everything dead or outsourced to China. It seems that fully three-quarters of all facts and figures from America's federal, state, and local authorities have been generated since 1970. Likewise with about 80 percent of scholarly historical calculations. (The rise of the Internet and the fall in the price of pocket calculators turn out to be pretty damn historic, statistically speaking.)
Indeed, faced with an undertaking so expensive and complex, the United States government itself—an organization notoriously fond of cost and complication—said thanks, but no second helping for us. The project was taken on instead by Cambridge University Press, with the cooperation of the U.S. Census Bureau and scholars from, among other places, the Economic History Association, the Social Science History Association, and the Cliometric Society. (Clio is the muse of history, and her measurements, I’m led to believe, are va-va-voom stuff when she lets down her hair and takes off those glasses.) Eighty-three professors provided essays, analyses, and critiques. The list of research assistants and supplementary experts goes on for three pages. In case you’re still unimpressed by the magnitude of the enterprise, let me put it another way: The Millennial Edition sells for $1,100.
So this is an important research work. After all, we cannot, as a nation, know where we’re going unless we know where we’ve been, even if I’m not exactly sure what I mean by that statement. Should we barrel down the highway to the future with our eyes glued to the rearview mirror? Should we keep in mind that if we forget history, we’ll end up repeating it, per that famous quote of Santayana’s? Yet consider the way history is taught these days: A survey of history students probably would show that they admire Santayana’s genius but hope his next CD won’t be just a remake of Supernatural with different vocalists.
Excuse me if I’m sounding a bit cracked and warped here. The Millennial Edition of Historical Statistics arrived at my office a fortnight ago weighing in at 25 pounds in five volumes totaling 5,286 pages and containing 37,339 data series quantifying everything that’s got quantity from the earliest Colonial period until the end of the 20th century. Wholesale price of New England codfish in 1634: 10½ shillings the hundredweight. Number of organized bowling teams in 1996: 1,023,785.
This would be enough to bend anyone’s mind. And it’s bent my mind the way it’s bent my IKEA bookshelves. I am a writer. I spend my days kneeling in the muck of language, feeling around for gooey verbs, nouns, and modifiers that I can squash together to make a blob of a sentence that bears some likeness to reason and sense. Imagine my ecstasy when I come across a hard, clean, bright, shiny number. Behold this gem of precision, perfect in its clarity and radiating mathematical reasonableness and arithmetical sensibility in every direction. I am free at last from the slime of words. Are shadows stretching their spectral arms to embrace the decline of day? Do vespers sound their quotidian knell? Does the gloaming echo with prelude to the nightingale’s descant? No! The sun sets at 7:56, and shut up.
Historical Statistics is an Aladdin’s cave full of such exactitudes. The size of its treasure trove virtually guarantees that it contains an answer—a dazzling, vivid integer of an answer—to every question about America and how it got that way. And I’ve spent the past two weeks plunging my arms deep into the Millennial Edition, then raising my hands high above my head and letting the diamonds, rubies, and pearls of enumeration dribble between my fingers while I cackle madly with glee.
As I was saying, Historical Statistics is a valuable source of primary data concerning a broad range of material and social conditions constituting the American experience. For example, how come our public schools used to be so good, but now they stink? The answer is right here in tables Aa625 and Aa626: spinsters.
Once there were women of a certain age with steel in their spine, ice in their gaze, buns in their hair, and remarkable reflexes for smacking knuckles with a ruler. They led three generations of bonehead O’Rourkes through the rowdy wilderness of crowded big-city schools and into at least a shallow pool of learning. Miss Prescott may have been a terror, but I didn’t emerge from her American-literature class under the misapprehension that the philosopher, poet, and author of The Last Puritan, Santayana, performed at Woodstock.
According to nuptiality records, in the last decades of the 19th century the proportion of women 65 and older who had never married was slightly above 6 percent. But by 1920, just in time for my father to go to high school, this number had risen to 7.08 percent. Dad very nearly graduated. In 1930, the spinster percentage was 8.11 percent. Thus my Uncle Mikey-Mike could count to five without removing his mittens. By 1940, the rate was 9.31 percent, and all of my older cousins acquired job skills enough to keep them from going on welfare. (They’ve been in jail occasionally, yes, but not on welfare.) And I more or less attended college, owing to the fact that by the time I started kindergarten, the spinster rate was still almost 9 percent. Competition at my school was fierce. Only the toughest of the tough could get the prize job of being the old lady who scared the bejesus out of me.
It’s all been downhill since then, alas, with spinsterhood plummeting to 5.49 percent by 1990 (when there weren’t really any spinsters anymore anyway—just strong women without concern for conformist social pressures, leading their own lives and owning too many cats).
During the golden age of American public schools, teachers also had the advantage of a level of educational funding very different from today’s. Table Bc925 shows that, in constant 1982–1984 dollars, annual per-pupil spending in public schools was $4,090 in 1996 (the last year for which figures are given). By contrast, in 1921, when my father was a high-school freshman, annual per-pupil spending (in the same constant dollars) was $399. Of course my father got a better education: With $399, you can’t afford to let the kids do anything but sit and study. Dad was forbidden to squirm, for fear he’d wear out a precious desk seat.
Americans may have gotten stupider in the late 20th century, but at least they were eating sensibly. We can see Americans becoming more lean and fit in tables Bd568 through Bd580, which detail per capita food consumption from 1970 to 1995. The amount of red meat on the average American’s plate declined from 192.4 pounds a year to 166.6.
Meanwhile, the intake of healthy and slimming fruits and vegetables rose from 564.4 pounds to 685.9 (if you count both fresh and processed). True, table Bd598 shows that between 1970 and 1994, Americans, on average, increased their daily intake of kilocalories from 3,300 to 3,800. But kilocalories are those European-style, metric-system calories that keep the French so thin.
The real reason Americans grew to the size of mastodons between the willowy days of Pat Nixon and the XXXL-thong Lewinsky years is to be found in tables Bd584 and Bd585. Annual consumption of bottled water and carbonated beverages went from 32 gallons per person in 1976 to 62.8 gallons in 1995. We’re not fat. But we are about to burst due to fluid retention. And since 51.2 of those 62.8 gallons of liquid are the fizzy stuff, it’s very important never to shake or drop an American or poke one with a sharp object. The result would be a mess worse than Watergate and the Clinton impeachment put together.
Speaking of which, do you know what causes low voter turnout in America? It’s the result of having the fate of our nation at stake. This began with the bitter presidential election of 1828, which pitted the education, cultivation, and puritan constraint of John Quincy Adams against the yahoo populism of Andrew Jackson, thereby deciding permanently whether America would become a shining city upon a hill or an overlighted strip mall along a highway. Voter turnout that year was 55.2 percent. A dozen years later, a small and unctuous incumbent, Martin Van Buren, the first professional politician to occupy the White House, ran against the vacuous William Henry Harrison, who would die from the pneumonia he contracted by giving an overlong inauguration speech in the freezing rain. Harrison’s platform consisted entirely of the slogan “Tippecanoe and Tyler Too.” Van Buren’s platform was even less substantive. There were no issues of note. And voter turnout was 77.5 percent.
In 1860, when a vote for or against Abraham Lincoln meant deciding whether to fight a civil war, 72.1 percent of eligible voters went to the polls. In 1876, when a vote for or against Rutherford B. Hayes meant bubkes, 82.9 percent of eligible voters showed up.
In 1932, with Republicans and Democrats offering radically different political and economic responses to the Great Depression, voter turnout was 56.8 percent. In 1940, with the reelection of FDR a foregone conclusion, turnout was 62.9 percent.
Another way to guarantee that a lower percentage of eligible voters will exercise their right to the franchise is to guarantee their franchise rights. Voter turnout in the presidential election of 1916 was 61.9 percent. Then, in 1920, the 19th Amendment was ratified, giving the vote to women. Voter turnout in that year's presidential election was 49.2 percent. The Voting Rights Act, ensuring access to the polls for blacks, was passed in 1965. Voter turnout went from 63.3 percent in 1964 to 62.5 percent in 1968. And after the voting age was lowered to 18, in 1971, voter turnout took a further dip, to 56.4 percent in the 1972 presidential election.
Extrapolating from the trend lines evident in Historical Statistics, we see that if one of the 2008 presidential candidates is a vicious moron (entirely possible) and the other is a beneficent genius (not as likely), and all life on Earth is threatened because al-Qaeda has discovered a way to poke every American with a sharp object simultaneously (could happen), and we extend the franchise to absolutely everyone, including preschoolers, citizens of the EU, illegal aliens, space aliens, and household pets (probably resulting in a better-informed electorate), we could achieve a voter turnout of zero.
That’s just the pleasant daydream of a tired, cynical writer in dread of looming deadlines for tired, cynical commentary on the 2008 presidential hopefuls. A William Henry Harrison of an inauguration to the lot of them.
But never mind—Historical Statistics displays plenty of other cheerful trends. We all became fabulously wealthy last century, so much so that the 1999 U.S. Census Bureau’s poverty threshold for a family of four ($10,221.49 in constant 1982–1984 dollars) was almost exactly equal to the compensation for a full-time worker back in 1941 ($10,360.54 in the same constant dollars). We got so rich that regular folks with ordinary jobs were somehow transformed into victims of soul-crushing poverty, the objects—at best—of our charitable contempt.
Then there’s the “Here’s your hat, what’s your hurry?” graph, figure Ae-B, showing how we got our crabby old parents out of our homes and into the Heaven’s Gate Elder-Care Facility. The vertical axis of the graph measures the percentage of codgers and crones. The horizontal axis is a time line from 1850 to 1990. On the graph is a line labeled “Living with own children” and a line labeled “Living alone or with spouse only.” Move your finger along the time line to 1935, when Congress passed the Social Security Act. Then move it a little more to the right to allow for the benefits to kick in and the wartime housing shortage to abate. Now move it halfway up the graph—X marks the spot! You’ve put your finger on a total sociological inversion. The living-alone line soars—a rocket probe into the deep space of solitary senility. The living-with-children line drops like a George W. Bush approval poll. In 1850, about 13 percent of people over 65 were living alone or with just their spouse, and about 70 percent were living with their children. By 1990, the percentages were reversed almost exactly.
What sighs of relief must have been expelled along with the seniors as spare rooms were aired, ancient toy poodles were euthanized, radios and TVs were tuned to bearable programming at endurable volumes, and the last vestiges of clan and tribal obligation were dumped in the trash, along with the renewal notice for the large-print edition of Reader’s Digest.
Throwing the old folks out of the house seems to have made us so happy that pretty soon, in pursuit of further happiness, we began throwing each other out of the house as well. Divorce rates doubled between the early 1960s and the middle 1990s. And one of the demographers who contributed an essay to Historical Statistics wrote that about half of all the marriages contracted at the end of the 20th century would end in divorce, and that about a third of the babies being born had unwed mothers. Putting all that together with a continuing decline in the fertility rate among native-born women, we can conclude that the final years of the last century were marked by a trend toward smaller and smaller—and, therefore, happier and happier—households. However, the suicide rate held steady, and even dropped a bit at the end of the 1990s, so Americans did not take the final step in making their households as absolutely small and perfectly happy as possible. But, as with the Social Security Act, the right government program may be able to remedy this.
On another upbeat note, as a longtime smoker, I’m glad to say that Historical Statistics allows me to conclude that stopping smoking causes lung cancer. Practically all Americans claim to have stopped smoking back in the 1970s, as they inevitably tell you every time they bum a cigarette. And per capita cigarette consumption did drop steadily from 1973 to 1994. But during the same period, the incidence of lung and bronchial cancer rose from 42.5 per 100,000 people to 57.1 per 100,000.
Doing a little additional fooling around with the numbers, I can also conclude that nobody stopped smoking at all. Per capita cigarette smoking was higher in 1994 than it was in 1941, and if you watch 1940s movies, you know that back then everybody smoked. In 1994, the number of cigarettes smoked divided by the number of Americans over 18 equaled about 123 packs a year. So if it’s only me who’s still smoking, plus you every so often—but just socially—then the two of us are spending one hell of a lot of time outside the back door of the office building in all kinds of weather.
There’s really no end to the things that one can conclude from these five volumes. Had I but world enough and time—and if the dog hadn’t chewed my cheap, flimsy pocket calculator—I could prove, disprove, re-prove, and improve more things about America than America has things. And America has a lot of things. Of course it probably helps that my education in mathematics ended with a Little Bighorn experience involving Chief Soh Cah Toa in sophomore trig. And I’ve never taken a statistics course. I don’t know a standard deviation from a deviation that would condemn me to eternal perdition, and I can’t tell a bell curve from Ms. Clio’s bustline. But all this is equally true of most readers. So I think I could pull it off.
Did you know that the percentage of young people in the crucial “youthquake” age bracket of 15 to 24 was higher in 1973 (18.5 percent) than in 1967 (16.7 percent)? Therefore it was glam rock that ended the war in Vietnam.
Mark Twain quoted the British Prime Minister Benjamin Disraeli as saying, "There are three kinds of lies: lies, damned lies, and statistics." That in itself was either a collation of official data or a fib, since there's no record of Disraeli saying any such thing. This goes to show we should take Twain's statement in the spirit in which it was made. Who more than Samuel Clemens was fond of a very, very tall tale? Imagine how the Celebrated Jumping Frog of Calaveras County would have performed if he'd been filled with the contents of the Millennial Edition of Historical Statistics of the United States.