Tuesday, May 24, 2005

The Puzzle of Penn Law School's Ranking

I've pretty well finished reverse engineering the way that U.S. News & World Report (USN&WR) ranks law schools, a project I've been working on for quite some time. I'll spare you the gruesome details about exactly how I built my model and instead simply show you a snapshot of the results: a comparison between the scores that USN&WR gave to each of the top 102 law schools in its latest rankings versus the scores generated by my model.

[Accuracy chart: USN&WR's published scores vs. the model's recalculated scores for the top 102 law schools]

As you can see, my recalculated scores generally came within a few percentage points of those assigned by USN&WR. I attribute that minor drift to the effects of the one bit of data that I could not recreate: the cost-of-living adjustment that USN&WR applies to some of the figures it tracks under the heading of "expenditures per student." I worked up a proxy, but it evidently doesn't track USN&WR's cost-of-living adjustments perfectly.
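For the curious, the basic mechanics are simple to sketch: standardize each measured category across schools, multiply by a weight, sum, and rescale. The snippet below is only a toy illustration of that kind of calculation; the school names, figures, and most of the weights are invented stand-ins (the one grounded number is that reputation and LSAT together carry 52.5% of the weight, as I note in the comments), and it is not my exact model.

```python
# A toy sketch of a weighted z-score model of the sort described above.
# School names, figures, and weights are illustrative assumptions, not
# the actual USN&WR inputs or my exact model.
from statistics import mean, stdev

schools = {
    "School A": {"reputation": 4.8, "lsat": 170, "expend_per_stu": 80000},
    "School B": {"reputation": 4.1, "lsat": 166, "expend_per_stu": 65000},
    "School C": {"reputation": 3.5, "lsat": 162, "expend_per_stu": 50000},
}

# Illustrative weights; reputation + LSAT total 52.5%, per the comments below.
# "expend_per_stu" here merely stands in for all the remaining factors.
weights = {"reputation": 0.40, "lsat": 0.125, "expend_per_stu": 0.475}

def z_scores(metric):
    """Standardize one metric across all schools."""
    vals = [data[metric] for data in schools.values()]
    mu, sigma = mean(vals), stdev(vals)
    return {name: (data[metric] - mu) / sigma for name, data in schools.items()}

z = {metric: z_scores(metric) for metric in weights}
raw = {name: sum(w * z[m][name] for m, w in weights.items())
       for name in schools}

# Rescale so the top school gets 100 (one simple choice; USN&WR's exact
# rescaling is not public).
lo, hi = min(raw.values()), max(raw.values())
final = {name: round(100 * (r - lo) / (hi - lo)) for name, r in raw.items()}
print(final)
```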

There remains just one glaring mismatch between USN&WR's scores and mine. See that dip near the upper left end of the chart? It shows that USN&WR assigns a notably higher score (9% higher) to the University of Pennsylvania School of Law (Penn) than my model suggests it should. The cost-of-living adjustment cannot explain that much of a difference. I have another, more troubling explanation.

If you'll allow me to gloss over some gory details, it boils down to this: Although Penn spends about as much as its neighbors in the rankings, it apparently spends its money rather differently from them. So, at least, it told the ABA, whence I got the data for my model. So what? Well, how a school classifies its expenditures makes a big difference in the USN&WR rankings. Hence my model's comparatively low score for Penn.

Why did USN&WR give Penn such a high ranking, then? That's the puzzle. I surmise that the classification of Penn's expenditures somehow got "fixed" for the USN&WR rankings. How might that have happened? I don't know. I will observe, however, that it does not look as if Penn simply made an error in reporting its expenditures to the ABA and then got USN&WR to correct that mistake, since Penn has stuck to the same basic classification scheme for at least two years running.

I don't want to make too big a fuss about this puzzle. Penn seems like a quality school, and goodness knows I'm not going to treat the USN&WR rankings as gospel truth in assessing law schools. But it perhaps bears noting that my model would drop Penn from its current slot, at 7, down to about 15. And, interestingly, that roughly corresponds to the ranking that Brian Leiter gives the school in his assessment of scholarly impact and his assessment of law school teaching.

Of course, a lot of people would be interested if some sort of hanky-panky explained this puzzle of how Penn did so much better in the USN&WR rankings than it seems it should have. I've already pointed out the discrepancy to USN&WR. I didn't ask for a response, however, and have not gotten one. I'm not a muckraker, and I repeat that I don't know why Penn appears higher in the USN&WR rankings than it does in my (apparently pretty accurate) model. I present this merely as an intriguing puzzle, one that probably illustrates the vagaries of law school ranking methodologies more than anything else.

18 comments:

Anonymous said...

The AALS Deans sponsored a statistical study that claims that almost all variance in rankings comes from LSATs and reputation. The other factors exist, and are used in the USNWR formula, but have very little effect in the rankings, or so the Deans' study said. Did you conclude differently?

Anonymous said...

Readers might find it interesting if you used your methodology to post the rankings within the third and fourth tiers of U.S. News. As you know, U.S. News posts the schools in these tiers alphabetically, so your methodology might shed some light on the rankings within these tiers.

Tom W. Bell said...

Anon of 7:02: If you look at the top headings in the chart in my second post on this topic, you will see that a school's reputation and LSAT scores together count for 52.5% of its overall score. The AALS Deans were thus correct to say that those measures account for "most" of the variance among schools' rankings. But it would perhaps mislead to claim that the other factors have "very little" effect on the rankings. They have comparatively little, granted. But woe unto the school that totally ignores the factors that make up 47.5% of its score!

Anon of 8:27: I have of course generated scores and, thus, rankings for the third and fourth tier schools. Indeed, I reverse engineered the USN&WR rankings system largely to get that data. I've here focused on the top schools because I've wanted to demonstrate the accuracy of my model by comparing the figures that USN&WR publishes for those schools (and those schools alone) with my recalculated ones.

Should I publish my results for the third and fourth tiers? It's such an interesting question that I think I'll devote a separate blog post to it. Please check back on the main page for my comments--or, more likely, my inconclusive questions.

Glen Whitman said...

Tom -- The percentage of the score attributable to an item is not the same as the variance attributable to that item. It might be that the items in the other 47.5% vary little from school to school, and therefore have little effect on *relative* rankings.

Tom W. Bell said...
This comment has been removed by a blog administrator.
Tom W. Bell said...

[I reposted this to clarify and add to my analysis of the contrast between what the AALS Deans allegedly said and what the data shows.]

I see what you mean, Glen. I was referring to the weight given to each category's z-scores. But maybe you're right that the Deans were referring to the degree of variation within each category. Even if they weren't, that's an interesting thing to look at.

I believe that the coefficient of variation would give us the dimensionless measure that we need to compare the spread of the different categories that USN&WR measures. Here, then, are the coefficients of variation of those categories (a sketch of the calculation follows the table):

Peer Rep: .32
Bar Rep: .26
LSAT: .03
Emp9: .07
GPA: .05
Overhead/Stu: .35
Emp0: .24
Stu/Fac: .22
Accep: .34
Bar Pass: .17
FinAid/Stu: .72
Library: .49
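For anyone who wants to check my arithmetic, the coefficient of variation is just the standard deviation of a category divided by its mean. A rough sketch of the calculation, with made-up numbers standing in for the real per-school data:

```python
# Coefficient of variation = standard deviation / mean, a dimensionless
# measure of spread.  The values below are placeholders, not the actual
# per-school data behind the table above.
from statistics import mean, stdev

categories = {
    "LSAT":     [158, 161, 163, 165, 170],  # hypothetical school medians
    "Peer Rep": [2.1, 2.8, 3.4, 4.0, 4.8],  # hypothetical 1-5 survey averages
}

for name, values in categories.items():
    cv = stdev(values) / mean(values)
    print(f"{name}: {cv:.2f}")   # LSAT comes out near .03, Peer Rep near .3
```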

On your take, Glen, the Deans would be saying that variations between schools in the top three categories have the *greatest* effect on the rankings because scores in those categories have the *least* variance. If mean LSAT scores ranged between only 165 and 175, for instance, a one-point jump in a school's mean LSAT would have a big effect on its ranking.
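To put rough numbers on that (using hypothetical spreads, not the actual data): because each category gets standardized, a one-point change moves a school's z-score by one divided by the category's standard deviation, so a tightly bunched category amplifies small differences.

```python
# Effect of a one-point jump on a standardized (z) score: delta_z = 1 / sigma.
# The sigmas below are hypothetical.
for sigma in (2.0, 5.0, 10.0):
    print(f"sigma = {sigma:4.1f}: a one-point jump moves z by {1 / sigma:.2f}")
# The tighter the spread, the bigger the move in the standardized score.
```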

But as the above table indicates, the top three categories evidently do not exhibit the least amount of variation. Granted, the third of those scores--the LSAT score--has the lowest variation of all categories. But that's emphatically not true of the top two scores, those measuring reputation. Rather, the reputation scores exhibit only a middling-to-high degree of variation. To illustrate, consider the same data rearranged from the least variation to the most:

LSAT: .03
GPA: .05
Emp9: .07
Bar Pass: .17
Stu/Fac: .22
Emp0: .24
Bar Rep: .26
Peer Rep: .32
Accep: .34
Overhead/Stu: .35
Library: .49
FinAid/Stu: .72

Anonymous said...

Penn Law has previously been caught giving different data sets to the ABA/AALS and USNews before - search the Chronicle of Higher Ed, they reported on this.

Anonymous said...

"Penn Law has previously been caught giving different data sets to the ABA/AALS and USNews before - search the Chronicle of Higher Ed, they reported on this."

Has anyone had any luck finding this article? I'd like to see it but cannot find it.

Tom W. Bell said...

Regarding the Chron. article, I ran some searches on the magazine's website and on LEXIS (which carries the magazine) and got nothing. So I, too, would be interested in learning more. Right now, I must regard the claim that Penn deliberately reported different numbers to the ABA and USN&WR as unsubstantiated.

I *did* find a story dated May 31, 1999, from The National Law Journal, that explored the reasons that some schools (including Penn) reported different stu/fac ratios than those used by USN&WR in that year's law school rankings. That was the first year that USN&WR had tracked stu/fac ratios, though, and the discrepancies seemed most easily attributable to innocent confusion about what number USN&WR wanted.

Anonymous said...

I've heard that some Penn alums have come into prominent positions at the US News organization, and the rise in the undergrad and law school rankings correlates with this.

Anonymous said...

Well, Bush "heard" there were weapons of mass destruction in Iraq...

-good one.

Anonymous said...

From your other post, you gave Penn -2.24 for student/overhead costs.

Are we really supposed to believe that out of the entire top 50 Penn spends the least in that area by several standard deviations?

More likely, you got the calculations wrong.

Anonymous said...

I noticed that the -2.24 number for Penn seemed way off base, too. Are you sure you have your facts and calculations straight here?

Tom W. Bell said...

Anons who raised the question of Penn's -2.24 z-score in the "expenditures/student" category: That number accurately reflects the data that Penn reported to the ABA (though probably not to USN&WR). How could Penn have such a low number there? Basically because Penn characterized its expenditures in an unusual way. More specifically, it characterized them so as to show very low expenditures in the aforementioned category and very high ones under "financial aid/student." Look at that category and you will see that Penn has an incredibly high z-score.

Put another way, Penn reports expenditures roughly equal to other schools near it in rank. As it characterized its expenditures in its ABA report, however, it spends its money rather differently from its peers. And, notably, it spends it in ways that disadvantage it in the USN&WR rankings. Compare the weight afforded to "expenditures/student" with the weight afforded to "financial aid/student."
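To make that concrete, here is a toy illustration (all of the dollar figures are invented, and the weights are merely assumed to differ in the way just described; none of it is Penn's actual data): shift a chunk of the same total spending from the heavily weighted category to the lightly weighted one, and the school's weighted contribution from those two categories drops even though its total spending hasn't changed.

```python
# Toy illustration: same total dollars, different classification.
# All figures and weights are invented for illustration only.
from statistics import mean, stdev

# Hypothetical per-student (overhead, financial aid) spending at five peers.
peers = [(70000, 15000), (60000, 18000), (80000, 12000),
         (65000, 20000), (75000, 10000)]
overhead_vals = [o for o, _ in peers]
finaid_vals = [f for _, f in peers]

def z(value, values):
    return (value - mean(values)) / stdev(values)

# Assumed weights: "expenditures/student" counts far more than "financial aid/student".
W_OVERHEAD, W_FINAID = 0.10, 0.015

def contribution(overhead, finaid):
    return W_OVERHEAD * z(overhead, overhead_vals) + W_FINAID * z(finaid, finaid_vals)

usual = contribution(70000, 15000)    # classified the way the peers classify
shifted = contribution(55000, 30000)  # same $85,000 total, much of it booked as aid
print(f"usual split: {usual:+.3f}   shifted split: {shifted:+.3f}")
# The shifted school shows a deeply negative overhead z-score and a big
# positive aid z-score, but the small aid weight can't offset the loss.
```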

Anonymous said...

Reed Shuldiner, when he was an associate dean in the law school at Penn, claimed that if Penn Law received a $20,000 alumni gift, and Penn's central university administration "taxed" away $2,000 and "returned" $18,000 to Penn Law, then Penn Law recorded that entire transaction as receiving $38,000. He went on to claim that such "creative accounting" was justified because there is no analogue to GAAP for making reports to U.S. News & World Report and because Penn's peers also do this. It would be worth investigating whether Penn Law did (and still does?) engage in such fraudulent and unethical reporting, especially in light of its being a law school that teaches professional ethics to law students and the recent wave of U.S. corporate accounting scandals.
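To spell out the arithmetic in that claim:

```python
# The alleged double counting, using the figures quoted above.
gift = 20_000
tax = 2_000                  # "taxed" away by the central administration
returned = gift - tax        # $18,000 "returned" to the law school
reported = gift + returned   # the same gift booked twice, less the tax
print(reported)              # 38000, i.e. about 1.9x the actual $20,000 gift
```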

Ben said...

Perhaps U Penn has a university-wide policy of pushing the ethical limits of “gaming” the rankings. Brian Leiter's post on the U.S. News rankings of undergraduate institutions suggests that Penn may be fudging numbers. Leiter writes:

The University of Pennsylvania also once again tied with Stanford (at 5th), and just a notch behind MIT at 4th, both schools that are so substantially better than Penn, that one wonders at the audacity of the editors to perpetrate this fraud on the reading public. (Same goes for putting Duke up there as well.) Penn, alas, is notorious for gaming the rankings; as one former Penn Dean said to me, "I'd hate to be around if they ever audited the books." Indeed.

Just a Thought said...

The funny part is, after a few years, people will genuinely believe the gamed rankings because they will have been consistent.

It surprises me more that the other schools are more scrupulous, not that Penn is less. The schools can't fudge the data for the ABA, but for a publication such as USNews it seems to me that the worst consequence is the development by the readership of a true understanding of how unrepresentative these rankings can be.

Anonymous said...

I think Penn is doing well and has gained a certain benefit from spending its money in a different way than its peers do. This is really a good effort.