
For details about how and why I modeled USN&WR's law school rankings, as well as for similar snapshots, see these posts from 2005, 2006, 2007, and 2008.
Perhaps in later posts I'll offer some reflections on what this year's model of the USN&WR rankings teaches. For now, I'll just offer this happy observation: The close fit between USN&WR's scores and the model's scores suggests that law schools did not try to game the rankings by telling USN&WR one thing and the ABA (the source of much of the data used in my model) another. Even a skeptic of law school rankings can find something to like in that.
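The "close fit" claim can be quantified with a correlation between the two sets of scores. The sketch below is purely illustrative: the score values and the correlation shown are invented, not the post's actual model output, and the post does not specify which fit statistic was used.

```python
# Hypothetical illustration: measuring how closely modeled scores track
# published overall scores. All score values below are invented.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

published = [100, 84, 80, 78, 75]  # USN&WR overall scores (hypothetical)
modeled   = [100, 85, 79, 78, 74]  # model's reproduced scores (hypothetical)

r = pearson_r(published, modeled)
print(f"correlation between published and modeled scores: r = {r:.3f}")
```

A correlation near 1 is what "close fit" amounts to here: schools reporting consistently to both USN&WR and the ABA would leave little unexplained gap between the two score series.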
[Crossposted at Agoraphilia and MoneyLaw.]
3 comments:
Does your model identify any problem with Brooklyn Law School, which reported its full-time enrollment statistics and represented them as the enrollment statistics for its entire class -- as explained further here:
http://www.usnews.com/blogs/college-rankings-blog/2009/05/18/what-happened-with-brooklyn-law-school.html
US News has not corrected Brooklyn's ranking, but it agrees that the ranking is wrong as a result of this misreporting.
Anon: Because I am interested in replicating USN&WR's results, I treated Brooklyn the same way that USN&WR did.
If you run the Brooklyn numbers with the full-time and part-time data combined, do you get a different scaled score that would change Brooklyn's rankings? If so, by how much?
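The recombination the commenter describes could be sketched as a count-weighted pooling of the full-time and part-time cohorts. This is only a guess at the mechanics, and every figure below is invented; the actual USN&WR inputs, Brooklyn's real numbers, and the resulting score change are not given in the thread.

```python
# Hypothetical sketch of pooling full-time (FT) and part-time (PT)
# cohort statistics via a count-weighted average. All numbers invented.

def combined_stat(ft_value, ft_count, pt_value, pt_count):
    """Count-weighted average of a statistic across FT and PT cohorts."""
    total = ft_count + pt_count
    return (ft_value * ft_count + pt_value * pt_count) / total

# e.g. a mean LSAT of 163 across 350 FT students and 159 across 150 PT
# students pools to a lower combined figure than the FT-only number.
print(combined_stat(163, 350, 159, 150))  # → 161.8
```

Since part-time cohorts typically post lower credentials, reporting FT-only statistics as whole-class statistics would inflate the inputs, which is why the combined numbers could move the scaled score.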
Post a Comment