A rush of deadlines has cramped my blogging over the last few weeks, including one for the AALS Workshop on Rankings forum. My co-author, Andy Morriss, was one of the panelists.
The various rankings panels and breakout sessions elicited lots of frustration and high-horse moralizing. As someone who has spent a fair amount of time examining rankings from an empirical perspective, I have come to one clear conclusion: normative arguments, no matter how cogent and persuasive as a matter of principle, are simply irrelevant. Rankings have whipsawed the legal academy because of institutional self-interest and a collective action problem.
The chart below, which is from our Measuring Outcomes paper, is a strong piece of evidence for how we (the legal academy) have lost our way. In 1997, U.S. News added bar passage to its placement methodology. This coincided with a movement toward higher bar cut scores. (See Gary Rosin's Unpacking the Bar for the implications of this trend.) Bar results have the advantage of being verifiable. In contrast, that same year, U.S. News dropped the salary component from its methodology because of evidence that schools were routinely inflating their data.
Andy and I had anecdotal evidence that schools responded to this pressure by increasing 1L attrition--i.e., imposing a strict curve that guarantees that a fixed percentage of students will flunk out. Because 1L grades are strongly correlated with bar passage scores, this policy could boost bar passage. According to data posted on the ABA website, between 1997 and 2004, law school attrition was up 20%. But when the numbers were disaggregated, the following pattern emerged.
Obviously, 1L attrition has gone up. Fortunately, since 1997, the annual ABA-LSAC Official Guide to Law Schools has published data in a uniform format for each law school, including a breakdown of attrition. (These changes were a response to complaints by U.S. News that law schools were lying to them.)
Andy and I gathered all the attrition data from the 1998 and 2005 editions (published in 1997 and 2004) and ran some paired-sample t-tests to determine whether the increases were statistically significant. The table below, which is weighted by class size, summarizes our results. (Note: to be included in the table, a school had to be ranked in 1997, so the results are not affected by the rush of new law schools.)
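For readers curious about the mechanics, a paired-sample t-test treats each school as its own control: you take the difference between a school's 2004 and 1997 attrition rates and ask whether the average difference is distinguishable from zero. Here is a minimal sketch with made-up attrition figures (these are illustrative numbers, not our data, and the simple version below omits the class-size weighting we used):

```python
import math

def paired_t(before, after):
    """t statistic for paired samples, where each pair is one school's
    attrition rate in the earlier and later year (difference = after - before)."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the differences (n - 1 denominator)
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    # t = mean difference divided by its standard error
    return mean / math.sqrt(var / n)

# Hypothetical academic attrition rates (%) for five schools
attr_1997 = [4.0, 6.5, 3.2, 8.1, 5.0]
attr_2004 = [5.5, 7.0, 4.8, 9.9, 5.6]

t = paired_t(attr_1997, attr_2004)
# A large positive t (here roughly 4.4 on 4 degrees of freedom) means the
# increase is unlikely to be chance; compare against the t distribution
# with n - 1 degrees of freedom to get a p-value.
```

In practice you would hand the two columns to a statistics package (e.g. `scipy.stats.ttest_rel`) rather than compute this by hand, but the logic is no more complicated than the above.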
The change in academic attrition is statistically significant at p < .015. So the Wigmore strategy appears to be returning. Yet, the change in "other attrition" is up even more. What is driving these results? Andy and I think it is the transfer student gaming strategy. In a nutshell, Elite school A or Tier 1 school B shrinks its 1L class, gets an LSAT boost, and makes up the revenue by admitting more transfer students, whose credentials are irrelevant for U.S. News purposes.
As was discussed at the Rankings forum, many law schools now actively solicit transfer students through various direct mailings. I doubt this was happening much, if at all, ten years ago. The Law School Survey of Student Engagement documents that transfer students tend to be more socially isolated than their peers who began as 1Ls.
Andy and I recommend that ALL outcome/placement data be released into the public domain: number of firms interviewing on campus; employment rates by practice setting; salaries by practice setting; MBE scores, controlling for entering credentials. This would at least force law schools to compete on value-added to students, rather than gaming. I wonder if the ABA reads this blog?