My ELS Ranking includes a proxy for empirical research output. The original ranking counted the number of articles by a law school’s faculty in 13 peer-reviewed, interdisciplinary journals associated with ELS (for years 2000-2004). This measure drew by far the most criticism. The complaints generally were: 1) the proxy favored economics-related work because a disproportionate number of the journals were in that field, 2) it ignored general law reviews, the primary outlet for ELS and other law scholarship, and 3) it included articles that lacked quantitative empirical work. I agreed that the measure was flawed, but the time required to create a new measure and collect relevant data was too great in light of the publication deadline that I faced. The 2005 ELS Ranking couldn’t be changed.
An updated ELS Ranking gave me the chance to fix its flaws, or at least the most glaring one: the empirical research output measure. Mark Lemley suggested counting the number of works referring to “statistical significance.” Building on work by Robert Ellickson, I had used that search term in 2005 to look at the ELS trend as reflected in Westlaw’s JLR database. That phrase (and its variants) may be the best proxy for ELS work because the expression is most likely to appear in articles that include quantitative analysis. It still has shortcomings: an ELS article may refer instead to the particular test statistic used, or it may describe the concept in terms more meaningful to a lay audience, partly to avoid confusion with practical significance. But on the whole it is a good measure, reducing false positives. And it should also reduce false negatives, because the search covers all journals in JLR (see Frank Cross’s 9/25 comment).
The 2006 ELS Ranking measures empirical scholarship based on references to “statistic! /1 significan!” in Westlaw JLR articles published since 1996. Each law school’s score is the sum of all matching articles by its research faculty divided by the total number of research faculty, giving a per capita ELS publication rate (shown in parentheses below).
1. Cornell (2.34)
2. Chicago (2.15)
3. Vanderbilt (1.67)
4. Northwestern (1.57)
5. UCLA (1.23)
6. Illinois and Stanford (1.20)
8. Yale (1.16)
9. Texas (.98)
10. George Mason (.97)
11. Harvard and Pennsylvania (.93)
13. Duke (.87)
14. Ohio State (.81)
15. NYU (.78)
16. Columbia and Wake Forest (.70)
18. USC (.67)
19. BYU (.64)
20. UC-Berkeley (.62)
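The per capita computation behind the scores above is simple to sketch. The code below is a minimal illustration with made-up numbers; the function name and the example counts are my own, not part of the ranking methodology.

```python
# Sketch of the per capita ELS score: total matching articles by a school's
# research faculty, divided by the number of research faculty.

def els_score(article_counts, faculty_size):
    """Per capita ELS publication rate for one school.

    article_counts: matching-article counts, one entry per faculty member
                    who had at least one match.
    faculty_size:   total number of research faculty (including those
                    with zero matching articles).
    """
    return sum(article_counts) / faculty_size

# Hypothetical school: 40 research faculty, 50 matching articles in total.
print(round(els_score([5] * 10, 40), 2))  # prints 1.25
```

Dividing by faculty size rather than reporting raw counts keeps large faculties from dominating the ranking, which is why a smaller school can outrank a larger one here.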
What do you think of this new measure? It will affect the overall rankings, as we’ll see tomorrow. As a final note, I again want to thank my research assistant Geoff Turvey.