For a parent drowning in glossy college mailings, a college admissions officer deluged with applications, or a student padding a résumé with extracurricular activities, it's easy to see applying to college as a universal American rite of passage—a brutal and ecumenical process that ushers each generation of stressed-out applicants into the anteroom of adulthood. But for many American teenagers the admissions process is something else entirely—a game that is dramatically rigged against them, if they even play it. In a country where a college degree is a prerequisite for economic and social advancement, rich and upper-middle-class students can feel secure about their chances. They may not have the grades or the good fortune to attend their first-choice schools, but they're still likely to be admitted to a college that matches their interests and ambitions reasonably well. For those further down the socioeconomic ladder, though, getting in is hard, and getting through can be even harder.
Native intelligence and academic achievement do lift many poor students into college. But especially where elite colleges are concerned, students from well-off families have a big advantage. The figures are stark. If you hope to obtain a bachelor's degree by age twenty-four, your chances are roughly one in two if you come from a family with an annual income over $90,000; roughly one in four if your family's income falls between $61,000 and $90,000; and slightly better than one in ten if it is between $35,000 and $61,000. For high schoolers whose families make less than $35,000 a year the chances are around one in seventeen.
This is not how the modern meritocracy was supposed to work. American higher education was overhauled in the middle years of the twentieth century to be a force for near universal opportunity—or so the overhaulers intended. The widespread use of the SAT would identify working-class kids with high "scholastic aptitude," as the initialism then had it (since 1994 the SAT has stood for "scholastic assessment"), and give them the academic chances they deserved. Need-based financial aid and government grants would ensure that everyone who wanted a college education could afford one. Affirmative action would diversify campuses and buoy disadvantaged minorities.
Part of this vision has come to pass. Minority participation in higher education has risen since the 1960s, and college campuses are far more racially and ethnically diverse today than they were half a century ago. But the socioeconomic diversity that administrators assumed would follow has failed to materialize. It's true that more low-income students enroll in college now than in the 1970s—but they are less likely to graduate than their wealthier peers. Through boom and recession, war and peace, the proportion of the poorest Americans obtaining college degrees by age twenty-four has remained around six percent.
This is not something that most colleges like to discuss—particularly elite schools, which have long taken pride in their supposed diversity. But the idea that the meritocracy isn't working is gaining currency among observers of higher education. It's visible in recent high-profile changes in the financial-aid policies of such schools as Harvard, Princeton, and the University of Virginia; in a thread of disquiet running through the interviews this magazine has conducted with admissions officers over the past two years; and in the unpleasant but undeniable conclusion of a number of new studies.
The most prominent of these studies was headed by William Bowen, a former president of Princeton, who since leaving that office, in 1988, has produced a series of weighty analyses of college admissions—on the consequences of racial preferences, the role of athletics, and, most recently, the question of socioeconomic diversity. In the recently published book Equity and Excellence in American Higher Education, Bowen and his co-authors use detailed data from the 1995 entering class at nineteen selective schools—five Ivies, ten small liberal arts colleges, and four flagship state universities—to argue that elite universities today are as much "bastions of privilege" as they are "engines of opportunity." Only six percent of the students at these schools are first-generation collegians; only 11 percent of the graduates come from families in the country's bottom economic quartile. The picture is even worse in another recent study. The education expert Anthony Carnevale and the economist Stephen Rose surveyed 146 top colleges and found that only three percent of their students came from the bottom economic quartile of the U.S. population—whereas 74 percent came from the top one.
At the very least, the persistence of this higher-education gap suggests that the causes of the decades-old growth in economic inequality are deeper than, say, tax cuts or the ebb and flow of the stock market. Inequality of income breeds inequality of education, and the reverse is also true: as long as the financial returns on a college degree continue to rise, the upper and upper-middle classes are likely to pull further away from the working and lower classes.
The United States still leads most countries by a considerable margin in proportion of the population with a college degree (27 percent). But when the sample is narrowed to those between the ages of twenty-five and thirty-four, we slip into the pack of industrialized nations, behind Canada, Japan, and five others. Further, the U.S. college-age population is swelling (it will increase by about 3.9 million during this decade, according to one estimate), with much of the growth occurring among low-income Hispanics, one of the groups least likely to attend college. Educating this population is an enormous challenge—one that we are unprepared to meet.
The obvious culprits are the universities, which have trumpeted their commitment to diversity and equal access while pursuing policies that favor better-off students. Not only is admitting too many low-income students expensive, but it can be bad for a school's rankings and prestige—and in the long run prestige builds endowments.
The current arms race for higher rankings began in earnest in the early 1980s, when the post–Baby Boom dearth of applicants sent colleges, both public and private, scrambling to keep tuition revenue coming in. It has been sustained by anxious Boomer parents, by the increasing financial advantages of a college degree, by cutbacks in government aid, and by magazines eager to make money from ranking America's top schools. The rankings rely on statistics such as average SAT scores, alumni giving, financial resources, and graduation rates. Attracting students with high scores and high family incomes offers the biggest gains of all. (See Matthew Quirk's "The Best Class Money Can Buy," page 128.)
Meanwhile, the admissions process is strewn with practical obstacles for low-income students. Early-admissions programs, for instance, which James Fallows has discussed in these pages (see "The Early-Decision Racket," September 2001 Atlantic), offer many benefits to applicants, but they almost exclusively help wealthy students, whose parents and guidance counselors are more likely to have the resources to take advantage of them. Poorer students are also less likely to know about the availability of financial aid, and thus more likely to let "sticker shock" keep them from applying in the first place. And a poor student put on a waiting list at a selective school is less likely than a well-to-do student to be accepted, because often a school has exhausted its financial-aid budget before it turns to the list.
In this scramble selectivity is "the coin of the realm," as one admissions officer put it to The Atlantic last year. More and more schools define themselves as "selective" in an effort to boost their position and prestige, and fewer and fewer offer the kind of admissions process that provides real opportunities for poorer students. As a result, those disadvantaged students who do attend college are less and less likely to find themselves at four-year schools. Among students who receive Pell Grants—the chief need-based form of federal assistance—the share attending four-year colleges fell from 62 percent in 1974 to 45 percent in 2002; the share attending two-year schools rose from 38 percent to 55 percent.
The advantage to well-off students is particularly pronounced at private colleges and universities. Over the course of the 1990s, for instance, the average private-school grant to students from the top income quartile grew from $1,920 to $3,510, whereas the average grant to students from the lowest income quartile grew from $2,890 to $3,460. And for all the worry of the middle class over rising tuition, increases in grant dollars often outstrip increases in tuition costs for middle- and upper-income students—but not for their poorer peers. In the second half of the 1990s, a study by the Lumina Foundation (a higher-education nonprofit) found, families with incomes below $40,000 received less than seventy cents in grants for every dollar increase in private-college tuition. All other families, including the richest, received more than a dollar in aid for every dollar increase in tuition.
It isn't just schools that have moved their aid dollars up the income ladder. State and federal governments have done the same. Since the 1980s public funds have covered a shrinking share of college costs, and with entitlements claiming an ever growing chunk of state and federal budgets, the chance of a return to the free-spending 1970s seems remote. But even when higher-education outlays have increased—they did during the 1990s boom years, for instance—government dollars have been funneled to programs that disproportionately benefit middle- and upper-income college students.
Both colleges and states have increasingly invested in "merit-based" scholarships, which offer extra cash to high-performing students regardless of need; these programs are often modeled on Georgia's HOPE scholarship, established in 1993 and funded by a state lottery, and thus amount to a form of regressive taxation. The federal government, meanwhile, has used tax credits to help parents defray the cost of college—a benefit that offers little to low-income families. Pell Grants have been expanded, but the purchasing power of individual grants hasn't kept pace with rising tuition.
Overall, American financial aid has gradually moved from a grant-based to a loan-based system. In 1980, 41 percent of all financial-aid dollars were in the form of loans; today 59 percent are. In the early 1990s Congress created a now enormous "no-need" loan program; it has been a boon for upper-income students, who can more easily afford to repay debts accrued during college. At the same time, the federal government allowed families to discount home equity when assessing their financial circumstances, making many more students eligible for loans that had previously been reserved for the poorest applicants. The burdens associated with loans may be part of the reason why only 41 percent of low-income students who enter four-year colleges graduate within five years, compared with 66 percent of high-income students.
All these policy changes have been politically popular, supported by Democratic and Republican politicians alike. After all, the current financial-aid system is good for those voters—middle-class and above—who already expect to send their kids to college, and who are more likely to take the cost of college into consideration when they vote. And though Americans support the ideal of universal educational opportunity, they also support the somewhat nebulous notion of merit and the idea that a high SAT score or good grades should be rewarded with tuition discounts—especially when it's their children's grades and SAT scores that are being rewarded.
But it's not enough to blame the self-interest of many universities or the pandering of politicians for the lack of socioeconomic diversity in higher education. There's also the uncomfortable fact that a society in which education is so unevenly distributed may represent less a failure of meritocracy than its logical endpoint.
That the meritocracy would become hereditary was the fear of Michael Young, the British civil servant who coined the term. His novel The Rise of the Meritocracy (1958)—written in the form of a dry Ph.D. thesis that analyzed society from the vantage point of 2034—envisions a future of ever more perfect intelligence tests and educational segregation, in which a cognitive elite holds sway until the less intelligent masses rise to overthrow their brainy masters. A scenario of stratification by intelligence was raised again in 1971, in these pages, by the Harvard psychologist Richard Herrnstein, and in 1994 by Herrnstein and Charles Murray, in their controversial best seller The Bell Curve. That book is now remembered for suggesting the existence of ineradicable racial differences in IQ, but its larger argument was that America is segregated according to cognitive ability—and there's nothing we can do about it.
Today Young's dystopian fears and The Bell Curve's self-consciously hardheaded realism seem simplistic; both reduce the complex questions of merit and success to a matter of IQ, easily tested and easily graphed. The role that inherited intelligence plays in personal success remains muddy and controversial, but most scholars reject the "Herrnstein Nightmare" (as the journalist Mickey Kaus dubbed it) of class division by IQ.
It doesn't really matter, though, whether our meritocracy passes on success genetically, given how completely it is passed on through wealth and culture. The higher one goes up the income ladder, the greater the emphasis on education and the pressure from parents and peers to excel at extracurricular achievement—and the greater the likelihood of success. (Even the admissions advantage that many schools give to recruited athletes—often presumed to help low-income students—actually tends to disproportionately benefit the children of upper-income families, perhaps because they are sent to high schools that encourage students to participate in a variety of sports.) In this inherited meritocracy the high-achieving kid will not only attend school with other high achievers but will also marry a high achiever and settle in a high-achieving area—the better to ensure that his children will have all the cultural advantages he enjoyed growing up.
Powerful though these cultural factors are, change is possible. The same studies that reveal just how class-defined American higher education remains also offer comfort for would-be reformers. Certainly, policies that strengthen families or improve elementary education undercut social stratification more effectively than anything colleges do. For now, however, numerous reasonably prepared students—300,000 a year, by one estimate—who aren't going to college could be. And many students who are less likely than their higher-income peers to attend the most selective schools would thrive if admitted.
The obvious way to reach these students is to institute some sort of class-based affirmative action—a "thumb on the scale" for low-income students that is championed by Bowen and by Carnevale and Rose in their analyses of educational inequality. Many elite universities claim to pursue such policies already, but Bowen's study finds no admissions advantage for poor applicants to the selective schools in the sample simply for being poor. In contrast, a recruited athlete is 30 percent more likely to be admitted than an otherwise identical applicant; a member of an under-represented minority is 28 percent more likely; and a "legacy" (alumni child) or a student who applies early is 20 percent more likely.
As an alternative Bowen and his co-authors propose that selective schools begin offering a 20 percent advantage to low-income students—a policy with "a nice kind of symbolic symmetry" to the advantage for legacies, they point out. By their calculations, this would raise the proportion of low-income students at the nineteen elite schools in their sample from 11 to 17 percent, without much impact on the schools' academic profiles.
Class-based affirmative action has an obvious political advantage: it's more popular with the public than race-based affirmative action. (Bowen envisions socioeconomic diversity as a supplement to racial diversity, not a replacement.) Increasing socioeconomic diversity might offer something to both sides of the red-blue divide—to a Democratic Party rhetorically committed to equalizing opportunity, and to a Republican Party that increasingly represents the white working class, one of the groups most likely to benefit from having the scales weighted at elite universities.
But however happy this may sound in theory, one wonders how likely schools are to adopt class-based preferences. As Carnevale and Rose put it, doing so "would alienate politically powerful groups and help less powerful constituencies"; Bowen notes that it would reduce income from tuition and alumni giving. A selective school might court backlash every time it admitted a poor kid with, say, a middle-range SAT over an upper-middle-class kid with a perfect score. It's doubtful that many colleges would be willing to accept the losses—and, for the more selective among them, the possible drop in U.S. News rankings.
Even the elite of the elite—schools like the nineteen examined in Bowen's book, which are best able to afford the costs associated with class-based affirmative action—seem more inclined to increase financial aid than to revamp their admissions policies with an eye toward economic diversity. In the past several years schools like Harvard, Princeton, and Brown have shifted financial-aid dollars from loans to grants, helping to ensure a free ride for the neediest students once they get in. Such gestures make for good public relations, and they do help a few students—but they don't make it easier for low-income students to gain admission.
The benefits and the limitations of moving from loans to grants can be observed in the "AccessUVa" program at the University of Virginia, one of the schools in Bowen's sample. In 2003 it had a typical entering class for an elite school—58 percent of the students came from families with annual incomes above $100,000—and in 2004 fewer than six percent of students came from families with incomes below $40,000. In 2004 Virginia announced that for students with family incomes below 150 percent of the poverty line it would eliminate need-based loans and would instead offer grants exclusively (the school has since raised the threshold to include families of four making less than 200 percent of the poverty line, or about $40,000). It would also cap the amount of debt any student could accrue, funding the rest of his or her tuition through grants. The school publicized its increased affordability, with large-scale outreach to poorer parts of the state. It's too early to judge the program's success, but the first year's results are instructive: the number of low-income freshmen increased by nearly half, or sixty-six out of a class of about 3,100. This is a praiseworthy if small step: those sixty-six brought the low-income total to 199, or about six percent of the class. But it does not solve the problem of unequal access to higher education.
Significant improvements in access, if and when they come, will probably have little to do with the policies at the most elite schools. In America access ultimately rests on what happens in the vast middle rank of colleges and universities, where most undergraduates are educated—in particular, in state schools.
One thing that's unlikely to happen is a sudden increase in funding for higher education, along the lines of the post–World War II surge that made college possible for so many young people. The budgetary demands of swelling entitlements and military spending, the wariness of voters who perceive schools (sometimes rightly, usually wrongly) to be growing fat off their high tuition, and the cultural chasm between a Republican-controlled government and a lefter-than-thou academy—all this and more ensures that spending on higher education will not leap to the top of the nation's political agenda. Instead, schools and legislators must be willing to experiment.
The good news is that there's no shortage of ideas. Bowen, for instance, points out that state schools might consider rethinking their relatively low tuition, which amounts to a subsidy for wealthy in-state parents. (Indeed, upper-income parents are increasingly choosing to send their children to state schools, presumably with just this advantage in mind.) These schools could keep their official tuition low while charging premiums for better-off applicants. Or they could follow the lead of Miami University, in Ohio, which recently raised in-state tuition to the same level as out-of-state tuition (from $9,150 to $19,730).
What should be done with the extra money? State governments might consider tying funding for schools more tightly to access—either directly, by rewarding those colleges that graduate larger numbers of low-income students, or indirectly, as Bowen and his co-authors suggest, by shifting funding from flagship universities to regional schools, which are more likely to enroll disadvantaged students.
More radically, states might ask how well they are serving their populations by funding public universities directly and allowing the universities to disburse the funds as they see fit. If the point of a public university is to hire superstar faculty members, build world-class research facilities, and compete with Harvard and Yale, then perhaps this way of funding makes sense. (It's worth noting that since the 1970s public schools have spent an increasing share of their funds on research and administration rather than on instruction.) But if the point is to make higher education more accessible, it doesn't.
The Ohio University economist Richard Vedder has suggested that states might consider offering less money to schools and more money to students, in the form of tuition vouchers redeemable at any public institution in their home state. These could be distributed according to financial need: if the average tuition in a state university system were $15,000, a poor student might receive a voucher for $15,000 and a wealthy student one for $3,000. Schools would have less of a financial incentive to admit mostly rich students. Vouchers might also simplify filing for financial aid; the economist Thomas Kane has argued that the sheer complexity of this process deters many low-income students.
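The sliding-scale arithmetic behind such a voucher can be sketched in a few lines of code. This is purely an illustration: the article gives only the two endpoint figures ($15,000 for a poor student, $3,000 for a wealthy one), so the linear schedule and the $150,000 income cutoff below are invented assumptions, not part of Vedder's actual proposal.

```python
def voucher_amount(family_income: float,
                   avg_tuition: float = 15_000,
                   floor: float = 3_000,
                   cutoff: float = 150_000) -> float:
    """Return a need-scaled tuition voucher.

    Hypothetical schedule: the voucher shrinks linearly from full average
    tuition (at zero income) down to a minimum amount at or above the
    income cutoff. Both the cutoff and the linear slide are assumptions
    made for illustration only.
    """
    if family_income <= 0:
        return avg_tuition
    if family_income >= cutoff:
        return floor
    # Linear slide between the full-tuition and minimum-voucher endpoints.
    share = 1 - family_income / cutoff
    return floor + (avg_tuition - floor) * share

print(voucher_amount(0))        # poorest student receives full average tuition
print(voucher_amount(150_000))  # well-off student receives the floor amount
```

Under any such schedule the school's revenue per admitted student no longer depends on family income, which is the point: the financial incentive to fill the class with wealthy students disappears.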
Like class-based affirmative action, a voucher program might be able to command support from both sides of the political aisle. The system's market-based efficiency would delight free marketeers (Vedder is affiliated with the conservative American Enterprise Institute), and its potential for increasing access might win the support of egalitarian liberals. And a voucher approach to funding state schools would mean less direct state involvement in higher education, which would please academics and administrators tired of having cost-conscious legislators looking over their shoulders.
Governments and public universities may also have lessons to learn from for-profit schools, which increasingly attract the students shut out of American higher education. Driven by bottom-line concerns, some of these schools enroll students who can't do the work, or promise job opportunities that never materialize. But many are oriented toward the needs of low-income populations. In New York State, for instance, some commercial schools set tuition at around $9,000—exactly the amount that a needy student can expect to receive from a Pell Grant combined with the state's tuition-assistance program. And they tend to serve the kind of students that traditional universities are failing—working adults, for instance, looking for the economic advantages that come with a college degree.
What gives the for-profit schools a leg up is their ability to "unbundle" a college education from its traditional (and costly) campus environment—something made possible in large part by the spread of the Internet. Some for-profit schools are entirely Web-based. Many others have put their reading lists, class registration, and even advising online. This is obviously not a model that a flagship state university is likely to emulate. But it may no longer make sense to spend a vast amount to sustain a traditional campus experience for the few when the same amount can provide an education for the many.
All these experiments—and that's what they are—have drawbacks. Public universities that spend more to improve access and graduation rates could make up for the expense by cutting, say, faculty salaries. Public schools already have a hard time keeping sought-after teachers from jumping to private colleges; if more money were spent enrolling and graduating poorer students, the problem would only worsen.
And the more that market efficiency was brought to bear on higher education, and the more that degree-granting and graduation rates were emphasized over the traditional academic experience, the more the liberal arts would be likely to suffer. Computer classes would crowd out Shakespeare, management courses would replace musical instruction, everyone would learn Spanish and no one Greek. Who would speak up to save liberal education?
The most obvious drawback is that a more egalitarian system, in which a college degree is nearly universal and therefore a less exclusive pathway to later success, would run counter to the interests of upper-middle-class parents—the people who wield the most influence in the politics of higher education. It's elite Americans who would lose out in class-based affirmative action. It's elite Americans who would pay more if state schools raised their tuition and state governments handed out income-adjusted vouchers. And it's elite Americans who would lose some of their standing if educational opportunity were more widely distributed. Why should they give it up? It's not as if our child doesn't deserve his advantages, parents might say, after helping that child rack up not only high grades and SAT scores but also a sterling record of community service.
What, really, does an eighteen-year-old high achiever "deserve"? A good college education, certainly—but surely not the kind of advantage that college graduates now enjoy. As Nicholas Lemann put it in The Big Test, his history of the American meritocracy, "Let us say you wanted to design a system that would distribute opportunity in the most unfair possible way. A first choice would be one in which all roles were inherited … A second unfair system might be one that allowed for competition but insisted that it take place as early in life as possible and with school as the arena." Students should be rewarded for academic achievement. But twelve years of parentally subsidized achievement should not hand them an advantage for the next fifty years of their lives.