The following is the story of what I believe to be the first ranking of all U.S. law schools. (If I am mistaken, please email me additional facts and I will post them.) As a service to the law school community, I have scanned and linked to the relevant articles. Otherwise, they would be hidden away in the stacks of most law school libraries. I hope readers find this account interesting and enjoyable.
The story begins with a magazine. In the early 1970s, the ABA Section on Legal Education and Admissions to the Bar attempted to elicit a dialogue between the legal academy and practitioners by starting a periodical called Learning & the Law. Its editor was a law professor named Charles Kelso (now emeritus at McGeorge), a person of prodigious energy who served on numerous ABA inspection teams during the 1960s and '70s. Before it ceased publication, Learning & the Law put out four terrific volumes (1974-77) that tapped many of the nation's leading academics and law school deans to write about issues affecting law, society, pedagogy, legal education, and the practicing bar.
In the summer of 1975, Charles Kelso drew upon data collected by the ABA to publish a rating of law school resources--e.g., faculty size, student/faculty ratio, volumes in library, etc. See Adding Up The Law Schools: A Tabulation and Rating of their Resources. The idea for the Resource Index emanated from Kelso's observation that schools with part-time programs tended to have fewer resources per capita and, in general, focused on training competent practitioners rather than producing scholarship, championing social causes, or providing an advanced liberal education. The compilation of such a list, reasoned Kelso, could help ABA inspection teams evaluate resources vis-a-vis law schools' stated social and educational missions.
The first paragraph of the article explicitly stated, "[This list] is not a quality rating of law schools" (emphasis in original). But it didn't matter. The article listed all U.S. law schools in descending order of resources. Further, each school was placed in one of seven groups (you know, like law school tiers) and assigned a resource grade of A, B, or C. Groups 1 and 2 were designated an "A"; Groups 3, 4, and 5 were given a "B"; and Groups 6 and 7 were given a "C". Kelso also noted that a recent listing of top law schools [based on survey data from law school deans, published in Change magazine, Winter 1974-75] closely tracked the top of the Resource Index.
In the weeks that followed, Learning & the Law was besieged with letters, many from law school deans, complaining about the inaccuracies of Kelso's methodology. The Fall 1975 issue published several critical letters and a comprehensive defense by Kelso. See In Defense of the Mathematics of Law School Evaluation. For a summary of that provocative exchange, you'll have to read past the jump.
With the benefit of hindsight, two unmistakable impressions emerge from the Fall 1975 issue of Learning & the Law: (1) Charles Kelso honestly believed that careful evaluation of law school resources and missions could improve the quality and cost of legal education for all relevant constituencies; and (2) the collection and presentation of that data in rank order was akin to letting the genie out of the bottle--no matter what Kelso said about the use and meaning of the "Resource Index", it was going to be perceived by the law school community as a ranking of quality.
The beginning of Kelso's Defense essay reads as follows:
The dean of a Midwestern law school has written to suggest that the record be set straight on what inference can and cannot properly be drawn from the Resource Index published in the Summer 1975 issue of Learning. This dean and several others have predicted that the Index will be misused to give the impression that the ABA is now in the business of ranking law schools and that the results of the Index methodology may be misinterpreted to imply a single-dimensioned "quality" ranking for law schools. ...
It should be restated that the Index "is not a quality rating of the law schools." The position of a school of the Resource Index is not an assessment of the relative academic or professional benefits that particular students may obtain by attending particular schools. The Index does not speak to research output or to the significance in the intellectual or legal life of the nation. ... Further, the ranking is only an approximation of relative position.
An "approximation" indeed. And because a definitive list of relative position did not exist, the Resource Index had the potential to serve as a single, gigantic anchor.
The responses by the deans were particularly telling. They did not object so much to the aggregation of the data and the distortions inherent in the quantitative process. Cf. Sauder & Lancaster, Do Rankings Matter? The Effects of U.S. News & World Report Rankings on the Admissions Process of Law Schools, 40 Law & Soc'y Rev. 105 (2006). Rather, the comments displayed a dissatisfaction (or occasional glee) with the results. In other words, "if the methodology were corrected, the Index would properly reflect [School X's] true position."
For example, Dean Norman Redlich of NYU Law, which had placed #8 in the Change magazine reputational ranking, complained that the Kelso methodology failed to take into account his school's 86 adjuncts, which enabled the school to provide a rich array of LLM specialty programs. Redlich also quibbled with the accounting for per capita library resources. Redlich concluded, "Adjusting Professor Kelso's rating accurately to reflect just the student/faculty ratio would lift New York University to his Group 1 ("A"), which is his highest category."
Dean Frederick M. Hart of the University of New Mexico complained that the Kelso methodology downgraded smaller schools when, in fact, small size was New Mexico's greatest strength. Dean Hart thus wrote (we assume partially tongue-in-cheek):
To downgrade us because of our small size is similar to penalizing the Dallas Cowboys because of "Too Tall Jones's" height. His height makes him a better football player and Dallas a better team; our size makes us a better law school. ... [When the Kelso methodology is corrected, it] clearly demonstrates that we are the best law school in the country.
Professor Peyton Neal, who was on the faculty at another small school, Washington & Lee, also complained about a probable size bias (but then why did Georgetown and NYU fare so poorly?):
I seriously question the fact that your analysis appears to place a premium on larger schools. While I believe that it may be probable that some of the larger schools are better in many respects, I question whether some of our smaller schools may indeed be in a superior position when all applicable statistics are considered realistically.
My favorite letter was written by Dean Monroe Freedman of Hofstra University School of Law, which was on the winning end of the Index game:
Hofstra Law School has been inundated by letters and telephone calls congratulating us, ever since the publication of Professor Kelso's Resource Index for rating law schools, published by the American Bar Association and Learning and the Law. On the Kelso Index, Hofstra is ahead of such schools as Georgetown and Boston University, tied with Boston College and George Washington, and only three points behind N.Y.U. ...
[But other important criteria have been omitted,] and Hofstra Law School, in particular, would show up even closer to the top of the list if these factors were appropriately taken into account.
Perhaps Kelso could argue that all of these academics were drawing improper inferences--that the real purpose of the Index was to create a means to diversify and ultimately improve legal education. But his Defense essay also cited numerous other data showing that a favorable Index score was strongly correlated with other desirable attributes, such as range of elective courses, clinical offerings, better placement in larger firms, more sophisticated teaching styles, and better credentialed and more prolific faculties who are more likely to be members of the American Law Institute. An inference of relative quality was therefore irresistible.
Charles Kelso concluded his essay by arguing that improvement in legal education had been stymied by a lack of the data needed to assess the relationship between law school programs and various measures of law school success. Thus, Kelso reasoned, the Resource Index should be compiled on an ongoing basis along with "an ongoing program of data collection about goals, programs, classroom interactions, exit data, and alumni career and competency information."
The ABA apparently did not share Kelso's optimism or vision. The Resource Index was not reproduced the following year. And by 1977, Learning & the Law ceased publication. Fifteen years later, U.S. News & World Report stumbled upon the concept of aggregating similar information for commercial gain. And as the various labor markets have anchored on these figures, virtually every law school in the country allocates millions of dollars in resources to jockey for position in a competition that is completely unhinged from educational merit and lawyer training.
There is a simple way to battle the influence of U.S. News--the ABA could complete Kelso's vision by collecting data and sponsoring research on the relationship between law school programs and various measures of lawyer success. When prospective students know how their tuition dollars relate to their longer term personal and professional goals, these empirical facts will have greater force than the U.S. News "approximation[s]" of educational quality. Students will vote with their feet.
But do law schools really want to compete on the basis of value-added for students? Or do we want to maximize things that matter to us (law school faculty), like scholarly prestige? I think the answer is obvious. Hence, the U.S. News stranglehold on legal education will continue until law schools start going bankrupt. And when we examine the upward trends in law school tuition, that day of reckoning may not be far off for many law schools.