As I recently observed, the close fit between law schools' scores in U.S. News & World Report's rankings and the scores of those same schools in my model of the rankings "suggests that law schools did not try to game the rankings by telling USN&WR one thing and the ABA . . . another." Since both Robert Morse, Director of Data Research for USN&WR, and the ABA Journal saw fit to comment on that observation, perhaps I should clarify a few points.
First, I have no way of knowing whether or not law schools misstated the facts, by accident or otherwise, to both the ABA and USN&WR. The fit between USN&WR's scores and my model's scores indicates only that law schools reported, or misreported, the same facts to each party.
Second, this sort of consistency test speaks only to those measures that USN&WR uses in its rankings, that it does not publish with its rankings, and that the ABA collects from law schools: median LSAT, median GPA, overhead expenditures/student, financial aid/student, and library size. Measures that USN&WR uses and publishes—reputation among peers and at the Bar, employment nine months after graduation, employment at graduation, student/faculty ratio, acceptance rate, and Bar exam performance—go straight into my model, so I have no occasion to test their consistency against ABA data. In some cases (the reputation scores and the employment-at-graduation measure), the ABA does not collect the data at all. That proves especially troubling with regard to the latter. We have little assurance that USN&WR double-checks what schools report under the heading of "Employment at Graduation," and no easy way to double-check that data ourselves.
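To make the logic of that consistency test concrete, here is a minimal sketch in Python. The score figures below are hypothetical, and the use of a simple Pearson correlation is my illustrative assumption; it stands in for, rather than reproduces, the fit measure behind my actual model.

    # A sketch of the consistency check, with made-up numbers.
    from statistics import correlation  # available in Python 3.10+

    # Hypothetical overall scores for five schools:
    usnwr_scores = [100, 95, 92, 88, 74]  # as published by USN&WR
    model_scores = [100, 94, 93, 87, 75]  # as rebuilt from ABA-reported data

    r = correlation(usnwr_scores, model_scores)
    print(f"fit between published and modeled scores: r = {r:.3f}")

    # A fit near 1 suggests schools told USN&WR and the ABA the same
    # thing, accurate or not; a weak fit would hint that they reported
    # different figures to the two parties.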
Third, and consequently, USN&WR could improve the reliability of its rankings by implementing some simple reforms. I suggested three such reforms some time ago. USN&WR has largely implemented two of them by making its questionnaire more closely mirror the ABA's and by publishing corrections and explanations when it discovers errors in its rankings. (I claim no credit for that development, however; I assume that USN&WR acted of its own volition and in its own interest.)
Another of my suggested reforms remains unrealized, however, so allow me to repeat it here: USN&WR should publish all of the data that it uses in ranking law schools. It could easily make that data available on its website, if not in the print edition of its rankings. Doing so would both provide law students with useful information and allow others to help USN&WR double-check its figures.
To that, I now add this proposed reform: USN&WR should either convince the ABA to collect data on law school graduates' employment rates at graduation or discontinue using that data in its law school rankings. That data largely duplicates the more trustworthy (but still notoriously suspect) "Employment at Nine Months" data collected by the ABA and used by USN&WR in its rankings. And, unlike the nine-month data, the "Employment at Graduation" numbers do not come in under the threat of ABA sanctions. We cannot trust the employment-at-graduation figures, and USN&WR does not need them.
Among the reforms I suggested some two years ago was one directed at the ABA, calling on it to publish online, in an easily accessible format, all of the data that it collects from law schools and that USN&WR uses in its rankings. I fear that, in contrast to USN&WR, the ABA has moved backward on that front. I leave that cause for another day, however; here I wanted to focus on what my model can tell us about USN&WR's rankings.
[Crossposted at Agoraphilia, MoneyLaw.]
Tuesday, August 04, 2009
5 comments:
I never trusted the USNWR. I don't think it's necessarily a reliable measure of how good a law school, or any college, is. However, the unfortunate truth is that those hiring look to these rankings to determine the quality of their applicants.
I'll meet you halfway, Tenrou, in that I agree that the rankings affect (at some margin) hiring decisions. But, while I would not take USN&WR's word on the ordinal ranking of law schools, it does offer some helpful information about the particular measures that it tracks. If I were a would-be law student, for instance, I would be interested in such things as a school's median LSAT and GPA.
It all turns on what the measure of a "good" law school is. If the biggest factor is the quality of job you can expect upon graduation (certainly one of the most important considerations for prospective law students, if not the most important), then the overwhelmingly heavy weight given to peer assessments is valid.
Although the rankings may affect hiring decisions only at some margin, and the margin can be negligible between, say, #57 and #82, it's pretty huge between #2 and #27. A fortiori for #82 and tier 4.
It will be interesting to see how UCI law does in its initial rankings, with its high median LSAT and GPA. More interesting, assuming it is ranked lower than schools in the region with similar stats, will be how its first graduating class's post-graduation job prospects stack up against those of its established, highly ranked peers. If it turns out that, say, UCLA and USC graduates get consistently better jobs, it will underscore the importance of school reputation, and perhaps give credence to the idea that USNWR's rankings simply reflect a "system" that has existed since time immemorial.
ceh: Granted that UCI poses an interesting case, we will not know how USN&WR treats it for some time, as the USN&WR rankings cover only accredited schools, and UCI is several years from accreditation.
Yes, I agree, Tenrou Ugetsu.