Readers of my post on college rankings have made excellent points about the limits of all published rankings, and their comments on the original page are worth careful reading.
Nick observes that rankings publications neglect the subjective and qualitative features of colleges and promote excessive reliance on numbers. I agree the danger exists. Yet I haven't found strong evidence of how influential rankings actually are in students' decisions. Has the popularity of US News changed the balance among peer institutions? I'm sure college admissions officers have studied the question, but I'm not sure how much has been published.
crimfan, an academic specializing in psychological testing and measurement, points to academic papers raising serious questions about the validity of rankings and to misunderstandings of the significance of numbers, especially problems of spurious precision. rd calls for more flexibility in letting students and their parents weigh factors on their own and develop personalized rankings, and crimfan agrees.
I would add that the problem is a special case of a paradox going back to Plato. To evaluate colleges, a student needs the very skills of quantitative interpretation (of ranking statistics) and close reading (of colleges' own statements) that he or she is going to college to acquire. When I attended a meeting of mathematics teachers and others involved in quantitative literacy, I discovered a wide range of opinions. (If you're interested, the papers and summary comments, including mine, have been published.) Of two good college math departments, one may be much more interested in quantitative skills across the curriculum, while the faculty of the other may, as one speaker put it, "want to clone ourselves." So one student's dream of personal attention and mentoring in math might be another, less assured beginner's disappointment.
High school seniors and parents unhappy with conventional rankings should be aware of an alternative supported by colleges themselves, the National Survey of Student Engagement (NSSE). It is sponsored by colleges and universities mainly as an internal tool for improving teaching and learning, not for comparison. USA Today has made colleges' voluntarily released scores available on an interactive web site. While some of the leading private colleges and state universities are included, many others are not, and there's no easy way to compile a list of high-scoring institutions. One of the big problems of social science research is how much of the really interesting material is never released to the public -- on the open Web or even in paid-subscription journals.
One way to fill the gap: read commentary in student newspapers and alumni magazines. You'll get a more rounded picture of what's going on -- as in this candid look at academic and extracurricular life at Harvard. Two related policies complicate life for undergraduates at conventionally highly ranked research universities. One is "steeples of excellence," first implemented at Stanford after the Second World War, in which some schools and departments are deliberately given more resources while others, even originally strong ones, are more modestly funded or even dropped. The other is "the well-rounded class," as opposed to the all-around student: instead of going for the very highest academic credentials, some colleges make most of their admissions decisions ultimately on signs of extracurricular gifts, often very specific ones, like a particular instrument in an orchestra or a crucial position on a team. So the college newspaper may, more than ever, be the competitor of the classroom -- considering Harvard tuition, a negatively paid journalism job.
In fact it's a leading member of the Harvard faculty, the social psychologist Daniel Gilbert, who has pointed to our inability to predict how happy any decision will make us. He counsels relying instead on our "psychological immune system."