
13 October 2006

Comments

John Rooney

I read the 1975 article which compared law schools back then. As I vaguely recall, Northwestern and the U. of Washington did surprisingly well because they had good ratios of professors to students and books to students. I think Leiter's comparisons sometimes use that approach. For example, he tells us endowment per student, as do some rankers of colleges.

Stephen M (Ethesis)

The ABA could complete Kelso's vision by collecting data and sponsoring research on the relationship between law school programs and various measures of lawyer success. When prospective students know how their tuition dollars relate to their longer-term personal and professional goals, these empirical facts will have greater force than the U.S. News "approximation[s]" of educational quality. Students will vote with their feet.

I think that is very true, and it is what the students are looking for (and think they are getting) when they read rankings.

Rick Lempert

There are many problems with law school rankings. At one time I looked closely at the U.S. News rankings and ALL measures were seriously flawed. (I could send a never-published draft article to anyone interested.) Some problems have been corrected, in part in response to comments from me and others because U.S. News has always wanted a quality survey, but many flaws remain and various biases are built into the survey. The two biggest problems in my view are (1) the extent to which the rankings can affect law school behavior in counterproductive ways and become self-fulfilling prophecies and (2) the fact that a valid unidimensional ranking is impossible. A Consumer Reports type ranking system in which relatively similar schools were clustered would be more defensible, though there would be obvious unfairness at cluster borders.
It is, ironically, the very impossibility of developing a consensually valid ranking of schools that helps make the U.S. News rankings seem on target. They look right. But what is overlooked is that there are many different rankings that would look right. This wasn't always the case. In the ranking issue that started the series (there was one based on different methods a year or two before), Harvard was number 5 as I recall, which by itself called the validity of the rankings into question. The next year Harvard had moved to #2, as U.S. News changed its weights and criteria. Indeed, when last I looked (about a decade ago) many changes from year to year were due to changes in the U.S. News criteria and weights rather than to changes in how the schools stood on various measures in relation to each other. Also, in those early days, if schools did not cooperate by providing U.S. News with information (and more information was confidential then), U.S. News "estimated" their position on these dimensions, often, the schools felt, at an unrealistically low level, thus penalizing them in their overall ranking and inducing them to provide their true data the next year. (I looked at this and found some egregious examples, but for the most part the estimations, if I remember correctly, did not seem vastly distorted.)
Also, from the start schools gamed the system or cheated. The first year U.S. News was able to get accurate LSAT data through the ABA, it reported that something like 30 schools had provided elevated data when they thought U.S. News had no way of checking. Other kinds of gaming the system also happened. For example, I was told of one school that admitted all their minority students to their evening school, since rankings were based on day school statistics, and then allowed those who wanted to transfer freely to the day division.
A final anecdote. I presented the draft paper at one school in U.S. News's 5th tier. The school was in my view a school with a number of quality faculty and had a right to be distressed. I did not hear a good word about the survey while at the school. Several months later a new ranking came out and a faculty member sent me a copy of a memo the law school dean had distributed bragging about the school's rise to the fourth tier. What the memo did not say was that U.S. News had eliminated the 5th tier and the 4th tier was at the bottom!
Rankings are not necessarily bad. In my experience at Michigan, some of the very best teaching goes on at the business school. Business schools don't care much about the U.S. News ratings because their students are interested in the ratings produced by one of the leading business magazines (e.g., Fortune, Business Week, Forbes; I don't remember which one). This magazine apparently surveyed students at these schools on the quality of their teaching and weighed this heavily in its rankings. The Michigan B School, which typically ranked somewhere in the teens, decided to emphasize teaching quality in hiring and promotion and was rewarded a few years later by a number two ranking; students were rewarded every day by the focus on good teaching and the efforts made to help faculty become better teachers.
The idea of an ABA or AALS ranking system is not a bad one if (a) the factors weighed are factors that matter to legal education and that we want schools to give greater emphasis; (b) there are separate rankings along different non-commensurate relevant dimensions; and (c) overall ranking is done by groupings rather than by creating false distinctions between schools that attract largely similar students and have faculties of more or less similar strength.

Rick

The comments to this entry are closed.
