Friday, May 27, 2005

Blissful or Blistering Ignorance?

Curiously, for a news magazine, U.S. News & World Report (USN&WR) apparently favors keeping its readers in ignorance. In its rankings of law schools, USN&WR gives individualized scores—and thus rankings—for only the top 100 or so law schools. (By convention, that group of law schools constitutes Tier 1 and Tier 2, each of which includes about 50 law schools.) USN&WR divides the remaining 80 or so ABA-accredited law schools that it evaluates into two groups of roughly equal size, Tier 3 and Tier 4, but it does not rank the schools within those tiers. Instead, USN&WR lists the schools in Tier 3 and Tier 4 alphabetically and offers some (but hardly all) of the data that goes into calculating each school's overall score.

USN&WR must, of course, calculate an overall score for every law school that it evaluates. How else could USN&WR figure out where each law school fits in its rankings? Yet, as I've described, USN&WR reports individualized scores and rankings for only the law schools in Tier 1 and Tier 2. Why does USN&WR refuse to reveal the overall score for each school in Tier 3 and Tier 4? Are we better off not knowing that information?

I've been unable to find an explanation by USN&WR of its reasons for not revealing the individualized scores and rankings for Tier 3 and Tier 4 law schools. I can thus only guess at its motives. I'm sure that USN&WR has the unreported data ready at hand and that publishing it would interest many of the magazine's readers. It thus has the ability and at least some incentive to reveal the scores of Tier 3 and Tier 4 schools. What disincentive stops it from doing so?

I can think of no particular legal liability that would arise from USN&WR publishing the scores of Tier 3 and Tier 4 schools. If hurtful rankings supported viable lawsuits, litigation would have begun long, long ago. Yes, the schools ranked at the bottom of Tier 4 would complain. But law schools have been voicing similar complaints about USN&WR's rankings for years and years. I thus don't think that crassly financial considerations motivate USN&WR to keep us ignorant of all the law school scores it calculates.

What else could motivate USN&WR? I wonder if mercy does. That, at least, has thus far restrained me from publishing the scores of Tier 3 and Tier 4 schools. As I mentioned in an earlier post, and elaborated on in a subsequent one, I've managed to "reverse engineer" how USN&WR ranks law schools. I related how that has allowed me to recalculate the scores of Tier 1 and Tier 2 schools. But, of course, I've also recalculated the scores of the Tier 3 and Tier 4 schools.

As a commenter on one of my prior posts observed, "Readers might find it interesting if you used your methodology to post the rankings within the third and fourth tiers of U.S. News." Indeed, they might. I've found poring over the scores of the Tier 3 and Tier 4 schools fascinating. Generating that data largely motivated my reverse engineering project, after all.

So why haven't I revealed those scores? Among the several reasons that give me pause, mercy counts for a lot. I am not a law school snob, after all. I can hardly afford to be one, given the (temporarily!) low rank of the (spanking new and thus largely unknown!) school where I teach. But even if I were teaching at a much more highly ranked school, I like to think that I would continue to respect the need for and good work of less well-ranked schools. To point out that a law school had scored at the very bottom of the USN&WR barrel would feel too much like kicking a colleague in the ribs. Better, I think, to let those unfortunate schools hide in unranked crowds, arranged not by scores but by names.

I recognize that an appeal to mercy applies best to Tier 4 schools. No Tier 3 school need fear being labeled "Worst Law School in the Nation!" Even with regard to Tier 3 schools, however, I hesitate to reveal individualized scores. Here, I have a motive that I doubt USN&WR feels so keenly: I don't want to encourage more obsession over the USN&WR rankings.

I think that the legal community already pays too much attention to USN&WR's rankings. To accept those rankings is to accept the weight USN&WR gives to each aspect of law school performance that it tracks. But does some law of nature require that, for instance, selectivity in admissions count for 2.5% of a school's overall score? No. That weighting may reflect a reasonable choice by USN&WR, but it is a choice that different people could make quite differently. In other words, actuarial mileage may vary.
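To make the point concrete, an overall score of this sort is just a weighted sum of component metrics, and shifting the weights reshuffles the rankings. Here is a minimal sketch in Python; the metric names and all weights except the 2.5% selectivity figure mentioned above are hypothetical placeholders, not USN&WR's actual methodology.

```python
# Hypothetical weights for a composite score. Only the 2.5%
# selectivity figure comes from the discussion above; the rest
# are illustrative stand-ins that sum to 1.0.
weights = {
    "peer_reputation": 0.25,
    "selectivity": 0.025,    # the 2.5% weight mentioned above
    "placement": 0.20,
    "other_factors": 0.525,  # everything else, purely illustrative
}

def overall_score(metrics, weights):
    """Weighted sum of component metrics (each scaled 0-100)."""
    return sum(weights[k] * metrics[k] for k in weights)

# A made-up school's component metrics, scaled 0-100.
school = {
    "peer_reputation": 80,
    "selectivity": 70,
    "placement": 75,
    "other_factors": 60,
}

print(overall_score(school, weights))
```

Change any weight and the same underlying data yields a different score, which is exactly why one observer's rankings need not match another's.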

All that said, I must admit that I am tempted to publish a fascinating chart I've generated that shows the downward slope of the overall scores of all schools ranked by USN&WR. That chart does not reveal any particular school's score, mind you. I've even stripped out the scale from the chart's y-axis. It still proves interesting, however, for showing that schools' scores drop not in a straight line, nor in the shape of half a normal-distribution curve, but along a more complex—though still smooth—curve. The curve's shape may even explain USN&WR's policy. I'll hold off on publishing it for now, though, as I don't want to let out of the bag a cat that I cannot thereafter catch.


Anonymous said...

Tom, one possible explanation is that the research methodology, established to evaluate differences between elite schools, is simply inappropriate to evaluate lower tier schools. For example, selectivity may be a fine way to distinguish between nationally drawing Yale and Stanford, but for lower tier schools that draw more regionally, a school's mere location may allow it to be more selective. I guess you could argue that fact alone should give that school a higher ranking, but now the logic is different, and thus the methodology is invalid.

It would be like evaluating race cars: At an elite level, minor variations in aerodynamics or frame stiffness play a huge part. At a lower tier of racing, however, maybe only driver skill and horsepower are needed to win. The point is that the evaluation criteria you would establish to judge elite cars and tier III cars may not be the same, and that if your reputation is at stake, you don't publish Tier III rankings using your Tier I method.


P.S. If you are below the University (sic) of Georgia, does it really matter?

Anton said...

My dad (a college teacher) once grumbled that the change from "A,B,C" grading to "A,A-,B+,B,B-,C+,C" (at UIUC circa 1981) was meant to make the question of cutoffs less worth quibbling about, but of course tripled the number of students who were one point below the cutoff; and each complained as loudly as before.

The schools in Tier 3 can all tell themselves they're near the top of Tier 3. Rank them and you take away that private consolation, making at least half of them more likely to complain.

Tom W. Bell said...


I agree that USN&WR's methodology may be best suited to elite schools, though I am not sure I would say that lower tier schools can be more selective. To the contrary, I suspect that those schools, in large part due to their low place in the rankings, generally do not get the students with the best LSATs and GPAs. So, at least, the data about their students' LSATs and GPAs, and about those schools' application/acceptance ratios, would suggest.

Still, it remains puzzling that USN&WR would have that view. I should think it would have an interest in presenting its rankings as universally valid. And, as I observed, people *do* care about the rankings of even the lower schools. Granted, maybe the people affiliated with the top schools don't care. But that still leaves a big potential audience that USN&WR is passing up.


What you say makes sense as a matter of human nature. But does USN&WR really care about the number of complaints it gets from ranked schools? I rather doubt it. It's been getting loads of complaints for years—even from top schools. Being well practiced in ignoring whining, why doesn't it rank *all* schools and walk away with the extra sales of its guide that it would thereby generate?