The Hidden Side of the Clinton Economy
The official government measures of unemployment and poverty disguise the fact that millions of Americans can't make a decent living

A TRIUMPHAL view dominates coverage of the longest peacetime recovery in U.S. history. The numbers tell the story: the lowest yearly unemployment in a quarter century, rising profits, a budget in balance, low interest rates, even lower inflation, and declining numbers of Americans classified as poor. All this is cause for rational exuberance. Allen Sinai, a noted economic analyst, has likened the times to a "worker heaven." But just as the buoyant Reagan economy of the 1980s masked seas of red ink, so the booming Clinton economy of the 1990s masks bad news. Relying on dubious measures that tell us good news, we have ignored the deepening erosion of the American Dream.
On the first Friday of every month the federal government announces the unemployment rate -- lately to much fanfare. A low rate signifies that American workers are able to take care of themselves, and that labor markets are tight and strong -- or so it is generally presumed. But is this really so? To most people the objective of employment is to earn a living. One's work is instrumental in achieving independence, self-sufficiency, and what some call competency. To the Founders, independence and competency meant that a person was able to earn a decent living through work. James Madison said that the happiest and most secure society was that in which the most citizens were independent. No republic could remain untroubled, he believed, if large numbers of citizens were economically marginalized. The self-evident truth that Thomas Jefferson proclaimed in his draft of the Declaration of Independence was that "all men are created equal and independent."
The principle of basic equality realized through economic independence inspired the Homestead Act. Enacted in 1862, it provided an opportunity for independence through grants of land sufficient to sustain a family to all who were willing to settle and work the land. Eighty-six years earlier Jefferson had proposed that the government of Virginia grant fifty acres of publicly owned land to any propertyless citizen willing to farm it. That everyone who is willing to work hard can make a decent living and get ahead is the American Dream.
The state of the American Dream, however, eludes measurement by official statistics that count workers as employed if they hold any job -- whether it is ten or forty hours a week; temporary, seasonal, or permanent; paying $7.00 or $70 an hour. In 1996 just over four million workers who were employed part-time said that they wanted to work full-time but could not find full-time jobs. Nearly 10 million worked full-time year-round but for less than $7.00 an hour. These two groups of workers, all of them counted as employed, amount to twice the seven million workers who held no job and were classified as unemployed. Together they total 21 million workers, not seven million.
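For readers who want to check the arithmetic, the tally behind those totals can be sketched in a few lines of Python, using the rounded 1996 figures cited above:

    # Rough tally of workers missed by the headline unemployment figure,
    # using the approximate 1996 counts cited above (in millions).
    involuntary_part_time = 4      # part-timers who wanted full-time work
    full_time_low_wage = 10        # full-time, year-round, under $7.00 an hour
    officially_unemployed = 7      # held no job and were counted as unemployed

    underemployed = involuntary_part_time + full_time_low_wage    # 14 million
    total = underemployed + officially_unemployed                 # 21 million

    print(f"Underemployed but counted as employed: {underemployed} million")
    print(f"Unemployed or underemployed in all: {total} million")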
Economists treat the unemployment rate as an indicator of aggregate pressure on the economy. But if many workers are underemployed, low unemployment does not signify a strong labor market that is likely to force up wages and hence generate inflation. That workers take and keep jobs at meager wages is generally a mark not of tightness but of an aggregate undersupply of decent employment.

What does low unemployment mean for the wages of the average American worker? From 1992 to 1998 the unemployment rate dropped by more than a third, yet the real hourly compensation of American workers remained virtually unchanged. Far from threatening to ignite inflation, workers' real wage increases have failed even to keep up with improvements in their productivity. Indeed, since 1973 the hourly compensation of workers would have to have grown by 24 percent more than it has (amounting to an increase in wages for the average full-time worker of more than $6,000 a year) just to match the gains that have taken place in worker productivity.
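To see what that 24 percent gap implies in dollars, consider a back-of-the-envelope sketch; the roughly $25,000 figure for average full-time annual compensation is an assumption inferred from the article's own numbers rather than stated in it:

    # Back-of-the-envelope check of the productivity gap described above.
    productivity_gap = 0.24         # compensation would need to be 24 percent higher
    avg_full_time_pay = 25_000      # assumed average annual compensation, in dollars

    annual_shortfall = productivity_gap * avg_full_time_pay
    print(f"Implied annual shortfall per worker: ${annual_shortfall:,.0f}")   # about $6,000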
Opinion surveys asked a sample of Americans to estimate the rate of unemployment in 1996, when it stood at about 5.3 percent -- close to what many economists consider to be "full employment." The respondents put the unemployment rate at about 20 percent, adding to economists' concern about the sorry state of public awareness of the economy. But perhaps the public was thinking of employment as the ability to be independent and earn a decent living. If so, the estimate that 20 percent of Americans lacked employment more accurately captured the real picture.
The unemployment rate is not the only gravely misleading statistic. Above the official poverty line, astonishingly, lie nearly half of all poor American families.
Historically, the poverty line established the household income required to afford basic necessities by contemporary norms. It was never intended to identify an income sufficient simply to stay alive. As a result the first federal poverty line was tied to the spending of the average American family, starting with the proportion of the family budget spent on food. In 1955, the year that was used as a basis for the first calculation, the average American family spent about a third of its budget on food -- whereas low-income families spent half or more. The formulation of the poverty line incorporated the smallest amount of money that a household would need to spend on food in order to provide adequate nutrition. This became known as the thrifty food budget. The government multiplied it by approximately three in order to arrive at the poverty line. In 1955 the poverty line for a family of four would have been about $2,700 a year. In 1964, when the government first officially reported the poverty line, it was adjusted, for a rise in food prices, to a bit under $3,200. It has been adjusted for inflation each year since.
Adjusting only for inflation, however, moored the poverty line thereafter to what households could afford on a 1955 budget. For example, the poverty line in 1994 (about $15,100 for a family of four) amounted, simply, to the number of 1994 dollars necessary to achieve the same material standard that $2,700 achieved in 1955. The problem with this standard is no quibble. Many of the expenses necessary for minimally decent living in most areas of the country today were not part of the lives of many Americans in the early 1950s. Even though food prices have risen since 1955 at practically the same rate as general inflation, food today accounts for barely a sixth of the average family budget, rather than a third. Continuing to use 1994 figures, let's reformulate the poverty line in the same way it was originally intended: given the Department of Agriculture's thrifty food budget of $4,576 for a family of four, and the proportion of the average family budget spent on food (about a sixth), the poverty line in 1994 should have been about $26,000 -- not $15,100.
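The original formula and the reformulation above come down to one line of arithmetic: divide the thrifty food budget by the share of the average family budget spent on food. A minimal sketch follows; the 1955 food budget of roughly $900 and the 1994 food share of 0.175 are inferred assumptions, chosen to reproduce the $2,700 and $26,000 figures cited above:

    # Poverty line = thrifty food budget / share of the average family budget
    # spent on food (equivalent to multiplying by roughly 3 in 1955).
    def poverty_line(thrifty_food_budget, food_share):
        return thrifty_food_budget / food_share

    # 1955: food was about a third of the average family budget.
    print(f"1955: ${poverty_line(900, 1 / 3):,.0f}")      # about $2,700

    # 1994: food was closer to a sixth of the average family budget.
    print(f"1994: ${poverty_line(4_576, 0.175):,.0f}")    # about $26,000, versus the official $15,100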
Even this amount would provide no more than the barest necessities. To the thrifty food budget of $4,576 add the cost of a two-bedroom apartment for four people that meets the government's definition of a low-cost rental unit, including the minimum charge for utilities and local telephone service: $512 a month, or $6,144 a year. Add the cost of operating, insuring, and keeping in working repair a single ten-year-old car, along with a little money for public transportation: $3,700 a year. Add no more than $250 for each family member for clothing; $2,000 for all medical, pharmaceutical, and dental expenses; personal and household expenses of $3,900 for everything from toothpaste to household repairs; and, finally, federal income, Social Security, and state income taxes totaling $3,270. These frugal expenditures come to nearly $25,000 -- with no provision for entertainment, vacations, child care, emergencies, or savings. No wonder respondents to a 1994 Roper survey thought that about $25,000 was the lowest income required for a family of four just to get by.
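The itemized budget can be summed directly; a minimal sketch, using the 1994 figures listed above:

    # The frugal 1994 budget for a family of four itemized above, in dollars per year.
    budget = {
        "thrifty food budget": 4_576,
        "rent and utilities ($512 a month)": 512 * 12,     # 6,144
        "car plus some public transportation": 3_700,
        "clothing ($250 per family member)": 250 * 4,      # 1,000
        "medical, pharmaceutical, and dental": 2_000,
        "personal and household expenses": 3_900,
        "federal, Social Security, and state taxes": 3_270,
    }
    total = sum(budget.values())
    print(f"Total: ${total:,}")    # $24,590 -- "nearly $25,000"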
In 1955 the official poverty-line income for a family of four would have stood at 59 percent of the median income for all married-couple families; by 1994 it had dropped to 33 percent. In other words, whereas the poverty line originally measured the poor, it has come to measure the very poor. Imagine a family of four living on $15,100 (almost 40 percent less than the $25,000 budget just described) trying to find decent housing with no more than about $300 a month for rent and utilities combined, having no more than $52 a week to spend on food for four people -- sixty-three cents a meal per family member.

Instead of measuring the very poor, the poverty line should be restored to its intended function -- defining the level above which a household that spends carefully can afford necessities within current norms. Practicing the work ethic should enable people to achieve this standard of living; that is the promise of the American Dream. In 1994 more than 65 million Americans lived in households unable to attain this standard, yet the federal government reported only 38 million as poor. Since 1972, by the poverty measure as originally intended, the proportion of Americans who are poor has grown from approximately 17 percent to more than 25 percent, despite the sizable growth over that period in the proportion of adults who have entered the labor force and taken jobs.
The connection in the everyday world between practice of the work ethic and reward has broken down, virtually unnoticed, during the past two generations. In the early 1950s the minimum wage paid about 110 percent of the amount required for a household with two full-time workers and children to get along. Full-time work at the minimum wage brought entry into the bottom of the middle class. By the 1970s the minimum wage for two workers had dropped to 90 percent of the necessary income. Now it provides less than 70 percent. Today more than 12 million full-time year-round workers are paid wages beneath those needed to support a minimally decent standard of living for households with children. Two thirds of workers who start at subpar wages are unable to lift themselves to a decent wage even after a decade of continuous full-time work. Likewise, millions of other workers who hold part-time jobs because they can't find full-time work also fall short.
During the past half century a new notion of the American Dream has taken hold among us. The belief that anyone who is willing to work and persevere can live decently has allowed us to assume that people with insufficient incomes are not deserving -- perhaps not even if they work hard. In place of the precept that people who practice the work ethic will succeed, the nation has moved close to accepting a paradigm of meritocracy, on all rungs of the income ladder. However intuitively appealing, meritocracy has a social Darwinist side. Building on the idea of rewarding ability, talent, and knowledge, it defends or at the very least excuses the existence of low-paying jobs on the grounds that people who cannot make the grade don't deserve a decent wage. This philosophy is exclusive, not inclusive. Whereas everybody can develop a strong moral character, not everybody has the ability, talent, cleverness, or knowledge that the market may demand. False official measures hide the reality that for large and increasing numbers of Americans, the economic foundation necessary to support a family, the morality of individual responsibility, and the work ethic is disappearing.
John E. Schwarz is the author of Illusions of Opportunity: The American Dream in Question (1997).
Illustrations by Maris Bishofs
The Atlantic Monthly; October 1998; The Hidden Side of the Clinton Economy; Volume 282, No. 4; pages 18-21.