Gordon Hylton (Marquette) has, in his view, made better use of the USNWR's own data in drafting "The US News and World Report Rankings Without the Clutter." He argues that "[t]he only categories that should matter in law school rankings are the quality of the students and the quality of the faculty" as measured by peer assessment score and LSAT score. His methodology and rankings can be found here. Under his methodology, the Top Ten Law Schools are:
1. Harvard
2. Yale
3. Stanford
3. Columbia
5. Chicago
6. NYU
7. Virginia
8. Michigan
9. Penn
10. Berkeley
Interestingly, he ranks all 179 law schools.
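To make the methodology concrete, here is a minimal sketch of what a two-factor ranking in the spirit of Hylton's approach could look like. The school names and scores below are invented for illustration (they are not USNWR data), and the equal weighting of the two normalized factors is an assumption of this sketch, not necessarily Hylton's exact formula:

```python
def two_factor_rank(schools):
    """Rank schools by the average of min-max-normalized
    peer assessment (1-5 scale) and median LSAT."""
    peers = [s["peer"] for s in schools]
    lsats = [s["lsat"] for s in schools]

    def norm(x, lo, hi):
        # Scale a value to [0, 1] within the observed range.
        return (x - lo) / (hi - lo) if hi > lo else 0.0

    scored = []
    for s in schools:
        score = 0.5 * norm(s["peer"], min(peers), max(peers)) \
              + 0.5 * norm(s["lsat"], min(lsats), max(lsats))
        scored.append((s["name"], round(score, 3)))
    # Highest combined score ranks first.
    return sorted(scored, key=lambda t: t[1], reverse=True)

# Invented example inputs:
example = [
    {"name": "School A", "peer": 4.8, "lsat": 173},
    {"name": "School B", "peer": 4.9, "lsat": 171},
    {"name": "School C", "peer": 4.2, "lsat": 168},
]
print(two_factor_rank(example))
```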
I had posted this previously on another site, but I thought people here might also have some interesting insights: As I recall, U.S. News uses a peer assessment score but doesn't use any citations-per-faculty or articles-per-faculty measure to gauge faculty productivity. That seems an obvious factor to consider, and it might address some of the sluggishness in the rankings, since productivity scores could differ substantially from faculty prestige scores.
********************
Jeff Yates
http://www.uga.edu/pol-sci/people/yates.htm
SSRN: http://ssrn.com/author=454290
*********************
Posted by: Jeff Yates | 10 April 2006 at 12:57 PM
Of course, this assumes a lot, not the least of which is that student quality is measured by the LSAT. I have seen reports indicating that undergrad GPA is more of a factor in how law students perform in law school, and that the LSAT's relevance has waned (though it is not irrelevant). That's true even with the variation in undergrad grading systems. Relative GPA (relative to the school mean) is obviously more accurate, but it is labor intensive, and U.S. News does not use it in the rankings.
Hylton's ranking is also a purely academic look, in that peer assessment is valued more highly than the assessments of practicing lawyers and judges -- who, of course, actually see the graduates in practice and know what they are capable of. Their judgment seems to me to bear more on the quality of the students and the faculty than a standardized test and how faculty members see each other.
The author is correct that lawyers and judges often guess, but so do other faculty members. In fact, every individual component of the rankings is flawed in a serious way. Eliminating all but two actually has the potential to make the "uncluttered" rankings even less accurate. At least with a diversity of measures some of the individual errors may be smoothed out.
A final comment: Students may use U.S. News to choose a law school, but big firm employers often evaluate law grads based on quality of institution, and U.S. News rankings are a factor in that analysis.
Posted by: Jeff McFarland | 10 April 2006 at 11:07 AM
Just glancing at the list ... it does not seem all that much different from the US News rankings. I wonder what the correlation between the two is? Who are the big "winners" and "losers" under this alternative measurement?
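One way to answer the correlation question quantitatively is Spearman's rank correlation, which compares two orderings of the same items. The sketch below uses invented rank positions, not the actual USNWR or Hylton rankings, and assumes no ties:

```python
def spearman(rank_a, rank_b):
    """Spearman's rho for two rankings of the same items,
    each given as {item: position}; assumes no tied ranks."""
    n = len(rank_a)
    # Sum of squared rank differences across all items.
    d2 = sum((rank_a[k] - rank_b[k]) ** 2 for k in rank_a)
    return 1 - (6 * d2) / (n * (n ** 2 - 1))

# Invented positions for five schools under two rankings:
usnwr  = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}
other  = {"A": 2, "B": 1, "C": 3, "D": 5, "E": 4}
print(spearman(usnwr, other))  # → 0.8
```

A rho near 1.0 would confirm the impression that the two lists are not all that different.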
Interesting stuff.
Posted by: Chris W. Bonneau | 08 April 2006 at 10:12 PM
Thanks for the link to Gordon's paper--I always learn a lot from Gordon's legal history work. This paper has an elegant simplicity to it, and its results bear a strong resemblance to my "hunch" about a lot of schools.
Posted by: Alfred L. Brophy | 08 April 2006 at 07:41 PM