Wednesday, August 09, 2006

Reforming the USN&WR Law School Rankings

Earlier this summer, I began a series of posts about the U.S. News & World Report's law school rankings. (Please see below for links to each post in the series.) My research uncovered many interesting and troubling things about the rankings. I discovered errors in the data that USN&WR used for the most recent rankings and, consequently, errors in the way that it ranked several law schools. More distressingly, I discovered that almost no safeguards exist to correct or prevent such errors. I think it fair to say that, but for my peculiar obsession with the USN&WR rankings, nobody would have noticed the errors I've documented. That won't do. We cannot rely on one nutty professor to keep the rankings honest. I thus here wrap up my series about the most recent USN&WR law school rankings by describing several reforms designed to make law school rankings more accurate and open. Although I suggest all of them, implementing any one of these reforms would make errors in the rankings less likely, and surviving errors more likely to get corrected.

1. USN&WR's Questionnaire Should Mirror the ABA's

Both the ABA and USN&WR send law schools questionnaires each fall. The latter apparently wants schools to repeat their answers to the former. Judging from how it asked schools to report their median LSAT and GPA data last fall, however, USN&WR could do a better job of clarifying exactly what data it wants. To avoid honest confusion or lawyerly logic-chopping, USN&WR's questionnaire should simply ask schools to repeat exactly the same answers that they put on the ABA's questionnaire.

2. USN&WR Should Commit to Publishing Corrections and Explanations

Law schools have a strong incentive to answer the ABA's fall questionnaire accurately, as that organization controls their accreditation. USN&WR, in contrast, wields no similar threat. Furthermore, law schools have a much more powerful incentive to dissemble on the USN&WR questionnaire, as their responses directly affect their rankings. What can USN&WR do to encourage law schools to give it accurate data?

USN&WR should commit now to publishing corrections to any inaccuracies it discovers in the data it uses to rank law schools. It should do so at all events, given that students use the rankings to make very important decisions. It can do so easily, too; it need only update its website. Yet USN&WR has thus far failed to correct the erroneous data it used to (mis)rank the University of Florida College of Law and Baylor University School of Law. For shame!

Perhaps USN&WR does not want to publicly acknowledge that its law school rankings sometimes contain errors, fearing that to do so would decrease the credibility of its rankings and, ultimately, its profits. Consumers will eventually discover the errors, though. Better that USN&WR should correct the rankings when necessary and thereby reassure its customers that it sells the best data available.

In addition to promising to correct errors in its rankings, USN&WR should also promise to document the cause of any errors it discovers. That double commitment would strongly discourage law schools from misreporting data on the USN&WR questionnaire. No school wants to earn a reputation for opportunistic lying. (Nor, of course, should any school suffer the wrongful imputation that it lied if, in fact, USN&WR causes errors in the rankings.)

3. USN&WR Should Publish All the Data it Uses in Ranking Law Schools

At present, USN&WR publishes only some of the data that it uses to rank law schools. Why? It is not at all clear. Even supposing that it would constitute an unwieldy amount of information in a print format, USN&WR could easily offer all the relevant data online. Specifically, USN&WR should publish the following additional categories of data for each law school it ranks:
  • median LSAT;
  • median GPA;
  • overhead expenditures/student for the last two years, which includes
    • instruction and administration expenditures;
    • a cost-of-living index applied to the preceding sum;
    • library operations expenditures;
    • law school miscellaneous expenditures; and
    • full-time enrollments;
  • financial aid expenditures/student for the last two years, which includes
    • direct expenditures on students; and
    • indirect expenditures on students;
    • (as well as the same full-time enrollments figures used in calculating overhead expenditures/student, above); and
  • library resources.
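
The overhead figure above combines several categories in a way worth making explicit. As a rough sketch of the apparent calculation (USN&WR does not publish its exact formula, so the weighting and two-year averaging here are assumptions, and all dollar figures are invented):

```python
# Hedged sketch of the apparent overhead-expenditures/student calculation:
# cost-of-living-adjusted instruction and administration spending, plus
# library and miscellaneous spending, divided by full-time enrollment.

def overhead_per_student(instr_admin, col_index, library, misc, fte):
    """Per-student overhead for one year, per the categories listed above."""
    return (instr_admin * col_index + library + misc) / fte

# Two-year average over hypothetical figures (every number here is invented).
years = [
    # (instruction/admin $, cost-of-living index, library $, misc $, FTE)
    (20_000_000, 1.10, 3_000_000, 1_000_000, 600),
    (21_000_000, 1.12, 3_100_000, 1_100_000, 610),
]
avg = sum(overhead_per_student(*y) for y in years) / len(years)
print(round(avg))  # average per-student overhead across the two years
```

If USN&WR published the underlying category totals, anyone could recompute a school's figure this way and flag discrepancies.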

Publishing all that data would allow others to double-check it, thereby helping to keep law schools honest and the law school rankings accurate.

4. The ABA Should Publish the Data it Collects and that USN&WR Uses to Rank Law Schools

At present, a law school must pay the ABA $1430/year to receive "take-offs" summarizing the data that the ABA has required all member schools to report. The ABA marks that data as "confidential" and forbids its unauthorized publication. As I discussed earlier, the ABA apparently treats law school data that way not to protect law schools or the ABA from public scrutiny, but rather to increase ABA revenue.

Given that revenue model, the ABA has a strong disincentive to publicly disclose all the data it collects from member schools. Fortunately, however, it need not do so in order to improve the USN&WR rankings. Rather, the ABA need only publicly disclose those few categories of data that it collects and that USN&WR uses to rank law schools. Together with the Law School Admission Council, the ABA already publishes much of that data in the Official Guide to ABA-Approved Law Schools. It remains only for the ABA to publish data in the following categories:
  • overhead expenditures/student, including
    • instruction and administration expenditures;
    • library operations expenditures; and
    • law school miscellaneous expenditures;
  • financial aid expenditures/student including
    • direct expenditures on students; and
    • indirect expenditures on students.

It would greatly help, too, if the ABA would publish, in a conveniently downloadable form, that and the other data that USN&WR uses in its rankings. The Official Guide to ABA-Approved Law Schools currently comes only in paper or PDF formats, making it necessary to scan or re-key the data needed to double-check the USN&WR rankings. That grindingly tiresome process invites the introduction of errors, throwing a needless hurdle before those of us interested in improving the law school ranking process.

As I said, adopting any one of the reforms I suggest would improve how law schools get ranked. Adopting all four would prove better yet. Please note, though, that I do not promote these reforms for the sake of USN&WR. It seems quite capable of milking the rankings cash cow without my help. Rather, these reforms stand to benefit all of the rest of us—students, professors, and administrators—who live in the shadow of USN&WR's law school rankings.

By opening up public access to the data used to rank law schools, moreover, the reforms I've proposed make it more likely that alternatives to the USN&WR rankings will grow in popularity. Rankings require data, after all. In a better world, the ABA would make lots and lots of data about the law schools it accredits freely available in a convenient-to-use format. Those of us who doubt that USN&WR has discovered the one sole Truth about how to measure law schools might then easily offer the world our own, new and improved, rankings.

So ends my series of posts about the most recent USN&WR law school rankings. I thank my gracious host and co-blogger, Glen Whitman, for putting up with my often-dreary march through the necessarily statistical and administrative arcana. Readers—if any!—who share my interest in these matters may want to note that I plan to write an academic paper relating and expanding on the observations I've made here. Please feel free to drop me a line if you have any suggestions about how I might make such a paper useful to you.

[NB: Cross-posted at Moneylaw.]

Earlier posts about the 2007 USN&WR law school rankings:


Anonymous said...

What about the dubious exclusion of 2L transfers? This skews the reality and encourages some schools (e.g., Georgetown) to artificially constrict their 1L class and then open the floodgates to 2L transfers. It is unfair and unhealthy for the system. It would also likely stop in a heartbeat if those transfer-seeking schools had to report the LSAT numbers of the transfers as part of that year's entering class.

Tom W. Bell said...

Agreed, Anonymous, that some schools use transfers strategically. But the blame, if such there be, lies not with the ABA. USN&WR simply follows the ABA's lead in terms of deciding which LSATs and GPAs to count.

So, should the ABA ask each school to report an aggregate LSAT and (undergrad) GPA for all its students? I'm not so sure. LSATs mean a lot more for 1Ls, after all, given that they are the best predictors of subsequent law school success. But transfers come with still better predictors: 1L grades. So we arguably *should* discount the importance of LSATs after the first year of law school.

David Bernstein said...

Why would U.S. News undertake the reforms you suggest if, as you suggest, it's likely to lead to an increase in popularity of alternative rankings?

Tom W. Bell said...

Good question, David B. I've worried about that rhetorical effect, too. It arises from my attempt to simultaneously persuade two different audiences, which have varying interests: USN&WR and those who dislike the influence of its rankings.

Perhaps I undertook a fool's errand. I think, though, that USN&WR would on net have an interest in adopting (most of) the first three reforms I suggested if the ABA would only adopt the fourth. In other words, USN&WR would have little choice but to pre-commit to correcting its rankings and disclosing the data it uses therein if it knew the ABA was going to release almost all the same data, anyhow.

Anonymous said...

I agree with the previous comment that U.S. News has very little motivation to do business differently. However, if U.S. News doesn't have the motivation to make these reforms, the ABA could reform U.S. News by policing the consumer information that law schools provide to commercial rankings and ratings companies.

The ABA should require all schools to publish (via Internet) the complete response to any commercial ranking and review publication request for information. Schools must ALWAYS provide consumers accurate information, not just in the annual survey. If you want to play the U.S. News game, you should be willing to do this. If you don't want to make your submission public, then you can't play.

We should expect more from the ABA, since we can't expect more from U.S. News. The results might be similar.

Tom W. Bell said...

I like your suggestion that schools publish (at least some of) their responses to the USN&WR questionnaire, Anon. I say "at least some of" because USN&WR asks for a lot of data that it does not (yet) use in the rankings. There's no need to repost that data, so far as reforming the rankings goes.

I'm not so sure I'd want the ABA to mandate republication of the data, though. The ABA already asks too much of law schools. And the new mandate you propose would require a bit of work, since USN&WR has schools fill out web-based forms. But, granted, it's just a question of re-entering the data.