The Hidden Side of the Clinton Economy

The official government measures of unemployment and poverty disguise the fact that millions of Americans can't make a decent living

A TRIUMPHAL view dominates coverage of the longest peacetime recovery in U.S. history. The numbers tell the story: the lowest yearly unemployment in a quarter century, rising profits, a budget in balance, low interest rates, even lower inflation, and declining numbers of Americans classified as poor. All this is cause for rational exuberance. Allen Sinai, a noted economic analyst, has likened the times to a "worker heaven." But just as the buoyant Reagan economy of the 1980s masked seas of red ink, so the booming Clinton economy of the 1990s masks bad news. Relying on dubious measures that tell us good news, we have ignored the deepening erosion of the American Dream.

On the first Friday of every month the federal government announces the unemployment rate -- lately to much fanfare. A low rate signifies that American workers are able to take care of themselves, and that labor markets are tight and strong -- or so it is generally presumed. But is this really so? To most people the objective of employment is to earn a living. One's work is instrumental in achieving independence, self-sufficiency, and what some call competency. To the Founders, independence and competency meant that a person was able to earn a decent living through work. James Madison said that the happiest and most secure society was that in which the most citizens were independent. No republic could remain untroubled, he believed, if large numbers of citizens were economically marginalized. The self-evident truth that Thomas Jefferson proclaimed in his draft of the Declaration of Independence was that "all men are created equal and independent."
The principle of basic equality realized through economic independence inspired the Homestead Act. Enacted in 1862, it offered all who were willing to settle and work the land an opportunity for independence: grants of land sufficient to sustain a family. Eighty-six years earlier Jefferson had proposed that the government of Virginia grant fifty acres of publicly owned land to any propertyless citizen willing to farm it. That everyone who is willing to work hard can make a decent living and get ahead is the American Dream.

The state of the American Dream, however, eludes measurement by official statistics that count workers as employed if they hold any job -- whether it is ten or forty hours a week; temporary, seasonal, or permanent; paying $7.00 or $70 an hour. In 1996 just over four million workers who were employed part-time said that they wanted to work full-time but could not find full-time jobs. Nearly 10 million worked full-time year-round but earned less than $7.00 an hour. These two groups, all of them counted as employed, are twice the size of the seven million workers who held no job and were classified as unemployed. Taken together, the underemployed and the unemployed number 21 million workers, not seven million.
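
Spelled out, the tally behind those figures runs:

\[
\underbrace{4}_{\text{involuntary part-time}} \;+\; \underbrace{10}_{\text{full-time, under \$7/hour}} \;+\; \underbrace{7}_{\text{officially unemployed}} \;=\; 21 \text{ million workers}
\]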

Economists treat the unemployment rate as an indicator of aggregate pressure on the economy. But if many workers are underemployed, low unemployment does not signify a strong labor market that is likely to force up wages and hence generate inflation. That workers take and keep jobs paying meager wages is generally a mark of an aggregate undersupply of employment, not of labor-market strength.



What does low unemployment mean for the wages of the average American worker? From 1992 to 1998 the unemployment rate dropped by more than a third, yet the real hourly compensation of American workers remained virtually unchanged. Far from threatening to ignite inflation, workers' real wage increases have failed even to keep up with improvements in their productivity. Indeed, since 1973 the hourly compensation of workers would have to have grown by 24 percent more than it has (amounting to an increase in wages for the average full-time worker of more than $6,000 a year) just to match the gains that have taken place in worker productivity.
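
A back-of-the-envelope check, assuming average full-time compensation of roughly $25,000 a year (a figure implied by, but not stated in, the text):

\[
0.24 \times \$25{,}000 \approx \$6{,}000 \text{ per year}
\]

which matches the article's estimate of how far wages have fallen behind productivity gains for the average full-time worker.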

Opinion surveys in 1996 asked a sample of Americans to estimate the rate of unemployment, which then stood at about 5.3 percent -- close to what many economists consider "full employment." The respondents' answers put the unemployment rate at about 20 percent, adding to concern among economists about the sorry state of public awareness of the economy. But perhaps the public was thinking of employment as the ability to be independent and earn a decent living. If so, an estimate that 20 percent of Americans lacked adequate employment more accurately captured the real picture.

* * *

The unemployment rate is not the only gravely misleading statistic. Above the official poverty line, astonishingly, lie nearly half of all poor American families.

Historically, the poverty line established the household income required to afford basic necessities by contemporary norms. It was never intended to identify an income sufficient simply to stay alive. As a result, the first federal poverty line was tied to the spending of the average American family, starting with the proportion of the family budget spent on food. In 1955, the year used as the basis for the first calculation, the average American family spent about a third of its budget on food, whereas low-income families spent half or more. The formulation began with the smallest amount of money a household would need to spend on food in order to provide adequate nutrition -- what became known as the thrifty food budget. The government multiplied that amount by approximately three to arrive at the poverty line. In 1955 the poverty line for a family of four would have been about $2,700 a year. In 1964, when the government first officially reported the poverty line, the figure was adjusted for the intervening rise in food prices, to a bit under $3,200. It has been adjusted for inflation each year since.
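
In schematic form, the original method was:

\[
\text{poverty line} \;=\; \frac{\text{thrifty food budget}}{\text{food's share of the average family budget}} \;\approx\; \frac{\text{thrifty food budget}}{1/3} \;=\; 3 \times \text{thrifty food budget}
\]

Working backward, the $2,700 figure implies a thrifty food budget of roughly $900 a year for a family of four in 1955 -- an inference from the numbers above, not a figure reported here.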

Adjusting only for inflation, however, moored the poverty line thereafter to what households could afford on a 1955 budget. For example, the poverty line in 1994 (about $15,100 for a family of four) amounted, simply, to the number of 1994 dollars necessary to achieve the same material standard that $2,700 achieved in 1955. The problem with this standard is no quibble. Many of the expenses necessary for minimally decent living in most areas of the country today were not part of the lives of many Americans in the early 1950s. Even though food prices have risen since 1955 at practically the same rate as general inflation, food today accounts for barely a sixth of the average family budget, rather than a third. Continuing to use 1994 figures, let's reformulate the poverty line in the same way it was originally intended: given the Department of Agriculture's thrifty food budget of $4,576 for a family of four, and the proportion of the average family budget spent on food (about a sixth), the poverty line in 1994 should have been about $26,000 -- not $15,100.
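
Applying the original formula to the 1994 numbers makes the recomputation explicit:

\[
\text{poverty line}_{1994} \;=\; \frac{\$4{,}576}{\text{food's share of the average budget}} \;\approx\; \frac{\$4{,}576}{0.176} \;\approx\; \$26{,}000
\]

(Dividing by exactly one-sixth would give about $27,500; the $26,000 figure implies a food share of about 17.6 percent, slightly more than one-sixth. That share is back-calculated from the article's own numbers, and is consistent with its "about a sixth.")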
