It is unusual nowadays to venture more than five minutes into any debate about the American economy—about widening income inequality, say, or threats to the country’s global competitiveness, or the squeeze on the middle class—without somebody invoking the great economic cure-all: education. We must improve it. For a moment, partisan passions subside and everybody nods.

But only for a moment. How, exactly, do we improve education? Where does the problem reside—in elementary schools, high schools, or colleges? Is the answer to recruit better teachers, or to get more students moving from high school to university? Should we spend more public money? Change the way schools are organized and paid for (supporting charter schools and vouchers, perhaps)? In no time, familiar partisan positions are laid down, and the quarreling resumes. But nobody challenges the importance of the issue. The centrality of education as a driver of the nation’s economic prospects appears beyond dispute.

Yet the connections between education and economics are not as they seem. To rest the case for improving schools and colleges largely on economic grounds is a mistake. It distorts education policy in unproductive ways. And though getting education right surely matters, more is at stake than a slight increase in economic growth.

Everybody understands that, as a rule of thumb, more school means a bigger paycheck. On average, having a college degree, rather than just a high-school degree, increases your earnings by about two-thirds. A problem arises, however, if you try to gross up these gains across the whole population. If an extra year of education equipped students with skills that increased their productivity, then giving everybody another year of school or college would indeed raise everybody’s income. But take the extreme case, and suppose that the extra year brought no gain in productive skills. Suppose it merely sorted people, signaling “higher ability” to a would-be employer. Then giving an extra year of school to everybody would raise nobody’s income, because nobody’s position in the ordering would change. The private benefit of more education would remain, but the social benefit would be zero.

Would sending everybody to Harvard raise everybody’s future income by the full amount of the “Harvard premium”? Yes, if the value of a degree from Harvard resided in the premium skills you acquired there (and if the college’s classrooms could somehow be scaled up to hold everyone). Well, ask any Harvard graduate about the teaching. The value of a degree from Harvard lies mainly in the sorting that happens during the application process. So the answer is no: if everybody went to Harvard, the Harvard premium would collapse.

In the case of an extra year of education, it need not be all or nothing; another year of study usually does impart some productivity-enhancing skill. But how much? A year of extra training in computer programming presumably has a direct material value. An extra year spent learning medieval history might improve a student’s intellectual self-discipline and ability to think analytically, but has lower material utility: nobody studies feudal land grants for the boost to lifetime earnings. So aggregated figures such as the proportion of high-school graduates going on to college—a number that is constantly cited and compared internationally—tell you very little.

Totting up college matriculations as a way of measuring national success is doubly ill-conceived if the signaling function flips over, so that a college education becomes the norm, and college nonattendance is taken to mean “unfit for most jobs.”

In 2004, 67 percent of American high-school graduates went straight on to college, compared with just under half in 1972. This is widely applauded. It looks like progress—but is it really? Failing to go to college did not always mark people out as rejects, unfit for any kind of well-paid employment. But now, increasingly, it does. In a cruel paradox, this may be one reason why parental incomes better predict children’s incomes in the United States than they used to—in other words, one reason why America is becoming less meritocratic. A college degree has become an expensive passport to good employment, one for which drive and ability can less often substitute, yet one that looks unaffordable to many poor families.

Many occupations are suffering from chronic entry-requirement inflation. Hotels, for instance, used to appoint junior managers from among the more able, energetic, and presentable people on their support or service staff, and give them on-the-job training. Today, according to the Bureau of Labor Statistics, around 800 community and junior colleges offer two-year associate degrees in hotel management. In hotel chains, the norm now is to require a four-year bachelor’s or master’s degree in the discipline.

For countless other jobs that once required little or no formal academic training—preschool teacher, medical technician, dental hygienist, physical-therapy assistant, police officer, paralegal, librarian, auditor, surveyor, software engineer, financial manager, sales manager, and on and on—employers now look for a degree. In some of these instances, in some jurisdictions, the law requires one. All of these occupations are, or soon will be, closed to nongraduates. At the very least, some of the public and private investment in additional education needs to be questioned.

To be sure, today’s IT-driven world is creating a genuine need for some kinds of better-educated workers. It is the shortage of such people, according to most politicians and many economists, that is causing the well-documented rise in income inequality. Both to spur the economy and to lessen inequality, they argue, the supply of college graduates needs to keep rising.

It seems plausible, but this theory too is often overstated, and does not fit the facts particularly well. The college wage premium rose rapidly for many years, up to the late 1990s. Since then it has flattened off, just when the pace of innovation would have led you to expect a further acceleration. An even more awkward fact is that, especially in the past decade or so, rising inequality has been driven by huge income increases at the very top of the distribution. In the wide middle, where differences in educational attainment ought to count, changes in relative earnings have been far more subdued. During the 1990s, CEO salaries roughly doubled in inflation-adjusted terms. But median pay actually went up more slowly than pay at the bottom of the earnings distribution, and even pay at the 90th percentile (highly educated workers, mostly, but not CEOs) increased only a little faster than median wages. Today, shortages of narrowly defined skills are apparent in specific industries or parts of industries—but simply pushing more students through any kind of college seems a poorly judged response.

The country will continue to need cadres of highly trained specialists in an array of technical fields. In many cases, of course, the best place to learn the necessary skills will be a university. For many and perhaps most of us, however, university education is not mainly for acquiring directly marketable skills that raise the nation’s productivity. It is for securing a higher ranking in the labor market, and for cultural and intellectual enrichment. Summed across society, the first of those purposes cancels out. The second does not. That is why enlightenment, not productivity, is the chief social justification for four years at college.

Shoving ever more people from high school to college is not only of dubious economic value; it is also unlikely to serve the cause of intellectual enrichment if the new students are reluctant to be there. Yet there are still large prizes to be had through educational reform—certainly in enlightenment and perhaps in productivity. They simply lurk farther down the educational ladder.

The most valuable attribute for young people now entering the workforce is adaptability. This generation must equip itself to change jobs readily, and the ability to retrain, whether on the job or away from the job, will be crucial. The necessary intellectual assets are acquired long before college, or not at all. Aside from self-discipline and the capacity to concentrate, they are preeminently the core-curriculum skills of literacy and numeracy.

Illiteracy has always cut people off from the possibility of a prosperous life, from the consolations of culture, and from full civic engagement. In the future, as horizons broaden for everybody else, people lacking these most basic skills will seem even more imprisoned. The most recent National Assessment of Adult Literacy found that 30 million adult Americans have less than basic literacy (meaning, for instance, that they find it difficult to read mail, or address an envelope). Three out of ten seniors in public high schools still fail to reach the basic-literacy standard. Progress on literacy would bring great material benefits, of course, for the people concerned and some benefits for the wider economy—but those benefits are not the main reason to make confronting illiteracy the country’s highest educational priority.

In addressing the nation’s assorted economic anxieties—over rising inequality, the stagnation of middle-class incomes, and the fading American dream of economic opportunity—education is not the longed-for cure-all. Nor is anything else. The debate about these issues will have to range all across the more bitterly disputed terrains of public policy—taxes, public spending, health care, and more. It is a pity, but in the end a consensus that blinds itself to the complexity of the issues is no use to anyone.