Could a College Scorecard Backfire? It Did for Law Students

The president wants colleges to start collecting jobs data from their graduates. But it could cause more harm than good. Just ask some recent law grads.  


Last week, President Obama unveiled a raft of proposals to rein in the rising cost of college. One of the most seemingly innocuous measures would require schools to begin collecting employment and earnings data from their recent graduates, so that potential students can have a realistic idea of their post-collegiate job prospects.

Simple. Transparent. Consumer-friendly. What's not to like?

In an ostensibly unrelated story a few days later, a group of lawyers filed suits against a dozen different law schools accusing them of using rosy, and grossly distorted, jobs data to dupe students into applying. They followed three similar suits filed last year, including one against Thomas M. Cooley Law School, which is the largest law school by enrollment in the country. The cases are seeking hundreds of millions of dollars, tuition refunds, and reforms to the way law schools calculate and present their jobs data.

Yes, these cases sound like a bad joke -- If a law school loses a suit to its recently graduated students, does that make it a terrible law school or a great one? -- but they offer a lesson that the administration should keep in mind if it's serious about pushing schools to collect job numbers. Bad data can be much, much worse than no data at all. And without serious oversight, there's a good chance you'll end up with some terribly misleading numbers.

The legal academy might be the ultimate case in point. As Paul Campos, a professor at the University of Colorado, noted in The New Republic, for years almost all 198 accredited law schools reported in U.S. News and World Report's annual rankings that at least 90 percent of their students had found employment nine months after graduation. Those numbers helped convince generations of young college graduates to pursue a J.D., assuming the degree would be as good as a guarantee of employment.

After the recession hit, and lawyers started getting laid off in droves, the cheerful jobs statistics started to look suspicious. U.S. News revised its standards, and suddenly, the employment rates started to fall.

But even the new figures might be far, far too optimistic. For instance, those stats don't tell you how many students had to take work outside the legal industry or are only employed part time. Campos dug into data for one top-50 school, and by the time he got done sifting, he estimated that maybe 45 percent of students had a full-time legal job nine months after graduation.

The Tennessee-based nonprofit Law School Transparency has been lobbying for changes in the ways schools collect and present data, and in December, the American Bar Association announced it would begin sending out a new survey to law schools that would allow it to audit their responses.

That would bring law schools much closer to the system MBA programs adhere to. Business schools began setting down uniform reporting standards back in 1994 and agreed to an auditing system in 2006. It isn't a perfect model. The University of Florida's B-school, for instance, was caught submitting incorrect stats to U.S. News in 2009. But it's much better than the essentially laissez-faire approach law schools have been taking.

All of this is to say that there are right ways to report job stats, and there are wrong ways. Beyond that, unscrupulous for-profit schools clearly aren't the only institutions that feel compelled to pretty up their job numbers. And if the administration wants to help students by providing them more information on schools' employment records, there will need to be a vigorous system for verifying all the data. That costs money, possibly for the government, and certainly for the colleges. It may be a worthwhile endeavor, but doing it right will come at a price. The cost of doing it wrong could be even steeper.

Just ask those pissed-off law students.
