[NB: Thanks to an email from Paul L. Caron, of the TaxProf Blog, I here offer a corrected version of my earlier post on this topic.]
As I said yesterday, I plan a series of posts about what I've recently learned about the U.S. News and World Report's law school rankings. Let me start here with a change that I've not seen reported elsewhere: This year's rankings saw a notable change in how the employment of a school's graduates affects its overall score.
In calculating its law school rankings, U.S. News and World Report takes two measures of a school's success at finding employment for its students: the percentage of graduates employed nine months after graduation (call it "Emp9") and the percentage employed at graduation ("Emp0"). Until the most recently released rankings (the "2007" rankings), the Emp9 measure counted for 12% of a school's overall score, while Emp0 counted for 6%. Starting this year, however, Emp9 counts for 14% and Emp0 for 4%. In other words, U.S. News moved 2 percentage points of weight from Emp0 to Emp9. (For more details about the methodology of the rankings, see here.)
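To make the arithmetic concrete, here is a minimal sketch that treats Emp9 and Emp0 as simple fractions entering the overall score directly. That is an assumption made purely for illustration; the actual U.S. News methodology standardizes each indicator against all schools before weighting, so these toy numbers will not reproduce the score shifts discussed below.

```python
# Illustrative sketch only (not the official U.S. News formula): treat the two
# employment measures as fractions between 0 and 1 and apply the old and new
# weights directly, to show what the 12%/6% -> 14%/4% shift does.

def employment_contribution(emp9, emp0, w_emp9, w_emp0):
    """Weighted contribution of the two employment measures to an overall score."""
    return w_emp9 * emp9 + w_emp0 * emp0

def weight_shift_effect(emp9, emp0):
    """Change in that contribution when the weights move from 12%/6% to 14%/4%."""
    old = employment_contribution(emp9, emp0, 0.12, 0.06)
    new = employment_contribution(emp9, emp0, 0.14, 0.04)
    return new - old  # algebraically, 0.02 * (emp9 - emp0)

# A school that places far more graduates by nine months than at graduation
# gains from the shift; a school with the opposite pattern would lose.
print(weight_shift_effect(0.95, 0.60))  # ~0.007
print(weight_shift_effect(0.70, 0.68))  # ~0.0004
```

On this toy model, a school's gain or loss is simply two percent of the gap between its two employment rates, which is why the re-weighting matters most to schools whose nine-month and at-graduation placement figures diverge sharply.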
Did the new way of measuring law schools' placement efforts affect the rankings? For some schools, almost certainly. Some schools did considerably better on the Emp9 measure, relative to their peers, than they did on the Emp0 measure. The schools that benefited most this past year from the change were Albany Law School-Union U. (NY) and U. of Memphis (Humphreys); each gained .03 points on the ranking's 100-point scale. Conversely, California Western School of Law and Texas Southern U. (Marshall) were hurt most, each losing .05 points.
Of course, those calculations tell us only about the effect of the change on schools' actual scores, not the rounded scores that U.S. News reports and uses in ranking schools. The change in how a school's placement efforts affect its ranking thus mattered most to schools whose actual scores sat near the margin of the next higher or lower rounded integer score. Since U.S. News doesn't report the actual scores, we can only guess which schools moved in the rankings due to the new methodology.
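As a hypothetical illustration of why that rounding margin matters, consider how small changes in an actual score play out when scores are reported as rounded integers on the 100-point scale (the scores below are invented solely for this example):

```python
# Hypothetical actual scores, chosen only to illustrate rounding near an
# integer boundary; U.S. News does not publish schools' unrounded scores.
for actual in (61.49, 61.51, 61.90, 62.10):
    print(f"actual {actual:.2f} -> reported {round(actual)}")
# actual 61.49 -> reported 61
# actual 61.51 -> reported 62
# actual 61.90 -> reported 62
# actual 62.10 -> reported 62
```

A shift of .02 across the half-point boundary changes the reported score, while a much larger shift away from that boundary does not, which is why the same re-weighting can move some schools in the published rankings and leave others untouched.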
Thursday, May 25, 2006
2 comments:
CEH: I think it more likely that you hear the sound of a *few* law school administrators cheering, a *few* grumbling, and by far the most just shrugging. Thanks to the many mysteries veiling the USN&WR rankings, very few schools have a firm idea about how, and how much, methodological changes affect them. I aim to help dispel some such ignorance, but must admit that I wasn't able to help much concerning the employment measures discussed in this post.
It's really amazing how much influence the media has on academia. You would think that intelligent people would refrain from such naive bickering over arbitrary rankings. Furthermore, if you actually consider how these rankings are conducted, you would see that they really have nothing to do with quality or "career value".
If there is one thing that has certainly become clear to me in the field of law, it's that your success as an attorney is not dependent on which school you graduate from. In the end, it's just a piece of paper you own. I've known terrible attorneys that graduated from "top" (and I mean top in the sense of these subjective rankings...as in 1-10!) law schools.
I am sure that if you ask any experienced attorney, they will tell you that you acquire most of your legal knowledge through experience and not during law school!
The lesson to learn: how successful you are as an attorney (or anything in life, for that matter) is not dependent on the writing on a piece of paper!