Sunday, May 28, 2006

Whence Come the Median LSATs and GPAs Used in the Rankings?

I here take up the last of the four changes that U.S. News and World Report made to its law school rankings this past year: moving from calculated median LSATs and GPAs to reported ones. This post offers some instructive background about the wending path that led to the present methodology. Why "instructive"? Because this wee bit of infometric history may teach us a great deal about how to reform the USN&WR rankings. I'll put off until the next post the one thing most readers probably care about: Who won and who lost by dint of the methodological change?

Prior to the fall of 2004, USN&WR asked each law school to calculate and report the median LSAT and GPA of its full-time first-year class. USN&WR did not ask schools to repeat the medians they had reported to the American Bar Association because that data did not exist. Back then, the ABA's annual questionnaire asked schools for only the 25th and 75th percentiles of their incoming students' LSATs and GPAs—not their 50th percentiles (i.e., their medians).

USN&WR changed its methodology in the fall of 2004, when it asked schools to report what they had told the ABA about their 25th and 75th percentiles. USN&WR then averaged those percentile scores to generate calculated medians to use in its rankings. Why did USN&WR stop asking schools to self-report medians? Because it had, as Robert J. Morse, director of data research for U.S. News, put it, "heard that some schools weren't computing their median accurately."
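To make the mechanism concrete: averaging the two reported percentiles amounts to taking their midpoint and treating it as the median. A minimal sketch (the LSAT figures below are made up for illustration; only the averaging step comes from USN&WR's stated method):

```python
def calculated_median(p25: float, p75: float) -> float:
    """Midpoint of a school's reported 25th and 75th percentiles,
    used by USN&WR as a stand-in for the true median."""
    return (p25 + p75) / 2

# Hypothetical school reporting a 25th-percentile LSAT of 160
# and a 75th-percentile LSAT of 166:
print(calculated_median(160, 166))  # 163.0
```

Note that this midpoint equals the true median only when the incoming class's scores are symmetrically distributed between the two percentiles; a skewed distribution can push the actual median well away from it, which is part of what made the calculated figures contestable.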

Few law schools will risk lying to the ABA, which wields the power to strip a school of its precious accreditation. USN&WR wields no similar power, however. As far as I've been able to determine—and to my surprise—USN&WR does not expressly threaten dissembling schools with any sanctions. It does not even appear to ask law schools to promise to tell the truth. Perhaps it charitably (but, alas, counterfactually) assumes the best about those who respond to its law school questionnaires. Or perhaps, already conscious of the ire its rankings cause, USN&WR worries that straightforwardly demanding honesty would come off as too heavy-handed. But I digress, straying to a topic that I'd planned to cover later, under the heading of suggested reforms.

USN&WR's use of calculated median LSATs and GPAs in the "2006" rankings (released in the spring of 2005) stirred up a controversy. Prof. Brian Leiter, long a critic of the rankings, offered both his own take and, by way of a generous quote, that of a dissenting correspondent. I won't dwell on the wisdom of using calculated medians, however, as the topic has become moot. As Leiter later highlighted, USN&WR changed the way it measures LSATs and GPAs yet again, last fall.

For this year's rankings—the "2007" ones—USN&WR went back to reported medians. Why? "We used a calculated median last year because of an absence of verifiable data for the actual median, but the American Bar Association now requires schools to report these data, permitting us to confirm the figures submitted to us," the magazine has explained. As with the FinAid and Fac/Stu indicators I discussed earlier, in other words, USN&WR has once again followed the ABA's lead when it comes to quantifying the performance of accredited law schools.

What effect did the new way of measuring LSATs and GPAs have on the most recent (the "2007") rankings? I'll take that up in the next post in my series. In the meantime, I need to finish grading and, since I gather I'm supposed to be enjoying a holiday weekend just now, perhaps even have some fun.

Earlier posts about the 2007 USN&WR law school rankings:
Change to U.S. News Law School Rankings Methodology;
"Financial Aid" Revised in U.S. News Methodology;
How USN&WR Counts Faculty for Rankings.
