Why do my fellow rankings geeks care about z-scores? In brief, these z-scores measure how well each school performed relative to its peers, thereby establishing its rank. (See here for a fuller explanation.) Because USN&WR uses z-scores to rank law schools, so too must any model of its rankings.

I weighted these z-scores simply by multiplying the z-score for each school, in each category of data, by the percentage weight that category carries in a school's overall score under USN&WR's rankings. That method of presenting z-scores has the virtue of highlighting which scores matter the most. You will thus generally find the largest weighted z-scores in the upper, left-hand corner of the chart, for instance, where lie both the most important categories of data and the law schools that scored the highest in the rankings.
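The calculation described above can be sketched in a few lines of Python. Note that the school names, raw scores, and category weights below are made-up placeholders for illustration, not USN&WR's actual data or weights:

```python
# Sketch of the weighted z-score method: standardize each category's raw
# scores, then scale by the share of the overall score that category carries.
# All data and weights here are hypothetical placeholders.
from statistics import mean, pstdev

# Hypothetical raw scores for a few schools in two categories.
raw = {
    "PeerRep": {"School A": 4.8, "School B": 4.7, "School C": 3.9},
    "BarPass": {"School A": 0.95, "School B": 0.93, "School C": 0.90},
}

# Hypothetical weights: the fraction of the overall score each category drives.
weights = {"PeerRep": 0.25, "BarPass": 0.02}

def weighted_z_scores(raw, weights):
    """For each category, convert raw scores to z-scores (score minus the
    category mean, divided by the population std dev), then multiply each
    z-score by that category's weight."""
    out = {}
    for cat, scores in raw.items():
        mu = mean(scores.values())
        sigma = pstdev(scores.values())
        out[cat] = {school: weights[cat] * (x - mu) / sigma
                    for school, x in scores.items()}
    return out

wz = weighted_z_scores(raw, weights)

# A school's modeled overall score is then just the sum of its
# weighted z-scores across all categories.
overall = {s: sum(wz[cat][s] for cat in wz) for s in raw["PeerRep"]}
```

Because each category's z-scores are scaled by its weight, a heavily weighted category (peer reputation, in this toy setup) dominates a lightly weighted one (bar passage) even when the underlying z-scores are of similar size, which is exactly the pattern the chart makes visible.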

Consider, for instance, the weighted z-scores of .68 enjoyed by both Yale and Harvard under the "PeerRep" category. Numbers that large (comparatively speaking) overwhelm the effect of other measures of those schools' performances—the schools' BarRep scores, at .39 each, come in a distant second—and have twice the impact of the peer reputation scores of schools ranked as close as 20th from the top.

Using weighted z-scores also has the virtue of showing how little influence many of the things that USN&WR measures have on its rankings. The weighted z-scores for bar pass rates among top-tier schools, for instance, vary between only .07 and -.02. Bar pass rates, however important to students, evidently do not matter much in USN&WR rankings.

Why did it take me so long to finish this year's model? In large part, you can blame my prepping two new classes (Property and a Law & Economics seminar) and serving on Chapman's Dean Search Committee (an effort that should soon conclude with our announcement of a fantastic new leader for our law school). Notably, though, some of the delay stems from how the ABA manages its statistical take-offs. The ABA recently abandoned its former practice of routinely sending electronic copies of its statistical take-offs at the request of any subscribing school. Allegedly, some Deans had complained that making the data available electronically would make modeling the USN&WR rankings too easy. Nice try, Deans! Also, the ABA this year neglected to send several subscribing schools, including my own, even *hardcopies* of the statistical take-offs. We got a prompt response from the ABA when we finally figured out that we were not to blame for the missing take-offs, but the mix-up still impeded my efforts. Again, though, geekery finally prevailed.

Interested in prior years' z-scores? Here are the ones from the 2010 rankings, the 2008 rankings, the 2007 rankings, the 2006 rankings, and the 2005 rankings.

[Crossposted at Agoraphilia, MoneyLaw.]

## 2 comments:

It looks like lawyer reputation is not second in importance. That honor appears to go to a column near the middle (of which I cannot read the heading) which has higher top scores.

Where did you get the cost of living indexes used to convert the resources factors? And what formula did you use to create synthetic employment at graduation numbers for schools not reporting it?

Jeff Stake

Jeff: Those scores are only the highest for the top schools, and only because they so blow the competition out of the water on those measures.

I use my own CoL formula, worked up based on what I could discern about USN&WR's formula. Ditto the synthetic employment numbers.
