With the Stu/Fac indicator, as with the FinAid indicator discussed earlier, USN&WR follows the lead of the American Bar Association. Prior to the fall of 2005, the ABA's annual statistical probe (ouch!) asked each law school to report only the number of faculty teaching that fall. The ABA then used that figure to calculate each school's student/faculty ratio. Ditto the USN&WR's inquiry and its calculation of the Stu/Fac ratio, which counts for 3% of each law school's score in the rankings.
That all changed last fall. The ABA started asking law schools to report both the number of faculty teaching during the prior spring semester and the number teaching that fall. It then combined those figures to generate a student/faculty ratio reflecting an entire year's worth of instruction. Ditto, again, the USN&WR questionnaire and Stu/Fac indicator.
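In rough terms, the difference between the two methodologies can be sketched as follows. The figures and the `old_ratio`/`new_ratio` helpers are purely illustrative assumptions, not the ABA's actual formula:

```python
# A minimal sketch of the two faculty-counting methodologies, assuming
# the ratio is simply students divided by a faculty count.
# All numbers below are hypothetical.

def old_ratio(students, fall_faculty):
    """Pre-2005 method: count only the faculty teaching that fall."""
    return students / fall_faculty

def new_ratio(students, fall_faculty, spring_faculty):
    """New method: average the fall and prior-spring faculty counts,
    so the ratio reflects a full year's worth of instruction."""
    return students / ((fall_faculty + spring_faculty) / 2)

# A school that adds faculty over the summer looks much better under
# the old, fall-only count than under the full-year average:
students, spring_fac, fall_fac = 600, 22, 34
print(round(old_ratio(students, fall_fac), 1))              # -> 17.6
print(round(new_ratio(students, fall_fac, spring_fac), 1))  # -> 21.4
```

As the toy numbers show, averaging in the prior spring's smaller faculty roster pulls a big fall improvement roughly halfway back toward the old figure.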
Which schools won or lost the most thanks to this new way of counting faculty? Because Stu/Fac now averages a school's student/faculty ratio from the reporting fall and the prior spring, it smooths out changes. The new methodology thus disadvantaged law schools that radically improved their student/faculty ratios last year. Herewith those six schools, along with the Stu/Fac ratios that they reported in the 2006 rankings and in the 2007 rankings, and the difference between the two:
| School | 2006 Stu/Fac | 2007 Stu/Fac | Change |
|---|---|---|---|
| Texas Wesleyan U. | 26.4 | 17.9 | -8.5 |
| St. Thomas U. (FL) | 27.3 | 20.2 | -7.1 |
| U. of Tulsa (OK) | 22.7 | 16.3 | -6.4 |
| Thomas Jefferson Sch. of L. (CA) | 27.6 | 21.9 | -5.7 |
| U. of Mississippi | 22.0 | 16.9 | -5.1 |
| U. of South Dakota | 21.2 | 16.1 | -5.1 |
On the other side of the scale, this table shows the six schools that reported the biggest increases in their Stu/Fac ratios from 2006 to 2007, and that therefore most benefited from the change the ABA and USN&WR made to the faculty-counting methodology:
| School | 2006 Stu/Fac | 2007 Stu/Fac | Change |
|---|---|---|---|
| North Carolina Central U. | 16.7 | 18.8 | 2.1 |
| U. of Idaho | 14.7 | 16.8 | 2.1 |
| Brooklyn L. Sch. (NY) | 17.8 | 20.5 | 2.7 |
| U. of Wyoming | 14.4 | 17.2 | 2.8 |
| Indiana U. Indianapolis | 14.2 | 18.0 | 3.8 |
| U. of Montana | 13.0 | 18.2 | 5.2 |
How much did those schools win or lose in the rankings? We can work up an estimate. Because the new methodology averages the reporting fall with the prior spring, the Stu/Fac change each of the above schools reported in the 2007 rankings reflects only about half of its true fall-to-fall change. Doubling that reported change and adding it to the school's 2006 Stu/Fac indicator thus generates a proxy for what its 2007 Stu/Fac would have been had the methodology not been changed. Plugging that alternative, "old-fashioned" Stu/Fac number into my model of the 2007 rankings yields some representative results:
| School | Alt. 2007 Stu/Fac | Change in 2007 |
|---|---|---|
| Texas Wesleyan U. | 9.4 | 1.8 |
| St. Thomas U. (FL) | 13.1 | 1.4 |
| U. of Tulsa (OK) | 9.9 | 1.3 |
| U. of Wyoming | 20.0 | -0.6 |
| Indiana U. Indianapolis | 21.8 | -0.8 |
| U. of Montana | 23.4 | -1.0 |
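The proxy calculation itself is simple arithmetic; here is a sketch (the helper name `alt_2007_stu_fac` is mine, not anything from the rankings):

```python
# Back-of-the-envelope proxy for a school's 2007 Stu/Fac ratio under the
# old, fall-only methodology. Because the new methodology averages the
# fall ratio with the prior spring's, a genuine fall-to-fall change
# shows up at only half strength; doubling the reported change and
# adding it to the 2006 figure recovers an estimate of the fall-only
# 2007 ratio.

def alt_2007_stu_fac(ratio_2006, ratio_2007):
    """Estimate the 2007 Stu/Fac as it would have been reported
    had the fall-only methodology remained in place."""
    reported_change = ratio_2007 - ratio_2006
    return ratio_2006 + 2 * reported_change  # == 2*ratio_2007 - ratio_2006

# Figures from the tables above:
print(round(alt_2007_stu_fac(26.4, 17.9), 1))  # Texas Wesleyan -> 9.4
print(round(alt_2007_stu_fac(13.0, 18.2), 1))  # U. of Montana -> 23.4
```

Those alternative ratios are what get plugged into the model of the 2007 rankings to estimate each school's gain or loss.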
I emphasize that those represent only rough estimates. Please note, in particular, that they rely on the accuracy of my model of the 2007 USN&WR rankings. I'll describe that model, the way I created it, and how well it reproduces the actual rankings in a subsequent post. Note, too, that whatever gains and losses I estimate the new Stu/Fac methodology caused, they represent mere transition effects. The new way of counting faculty will affect all schools equally in the next USN&WR rankings.
Regardless of who won and lost by dint of the new Stu/Fac methodology, it bestows two happy effects on us all. Firstly, law schools worried about the USN&WR rankings (i.e., all ranked law schools) will stop pushing faculty to go on sabbaticals, accept visitorships, or otherwise take leaves of absence only during the spring semester. Secondly, would-be students will get more accurate information about law schools' student/faculty ratios.
Other posts about the 2007 USN&WR law school rankings:
Change to U.S. News Law School Rankings Methodology;
"Financial Aid" Revised in U.S. News Methodology.