My ELS Ranking includes a proxy for empirical research output. The original ranking counted the number of articles by a law school’s faculty in 13 peer-reviewed, interdisciplinary journals associated with ELS (for years 2000-2004). This measure drew by far the most criticism. The complaints generally were: 1) the proxy favored economics-related work because a disproportionate number of the journals are in that field, 2) it ignored the primary outlet for ELS – and other law scholarship – general law reviews, and 3) it included articles that lacked quantitative empirical work. I agreed that the measure was flawed, but the time required to create a new measure and collect relevant data was too great in light of the publication deadline I faced. The 2005 ELS Ranking couldn’t be changed.
An updated ELS Ranking gave me the chance to fix flaws, or at least the most glaring one: the empirical research output measure. Mark Lemley suggested counting the number of works referring to “statistical significance.” Building on work by Robert Ellickson, I had used that search term in 2005 to track the ELS trend as reflected in Westlaw’s JLR database. That phrase (and its variants) may be the best proxy for ELS work because the expression is most likely to appear in articles that include quantitative analysis. It still has shortcomings: ELS work may refer instead to the particular test statistic used. Or the article may describe the concept in terms more meaningful for a lay audience, or avoid the phrase to prevent confusion with practical significance. But, on the whole, it is a good measure, reducing false positives. And it should also reduce false negatives by looking at all journals in JLR (see Frank Cross’s 9/25 comment).
The 2006 ELS Ranking measures empirical scholarship based on references to “statistic! /1 significan!” in Westlaw JLR articles published since 1996. Each law school’s score is the sum of all articles by its research faculty divided by the total number of research faculty, giving a per capita ELS publication rate (reflected in parentheses).
1. Cornell (2.34)
2. Chicago (2.15)
3. Vanderbilt (1.67)
4. Northwestern (1.57)
5. UCLA (1.23)
6. Illinois and Stanford (1.20)
8. Yale (1.16)
9. Texas (.98)
10. George Mason (.97)
11. Harvard and Pennsylvania (.93)
13. Duke (.87)
14. Ohio State (.81)
15. NYU (.78)
16. Columbia and Wake Forest (.70)
18. USC (.67)
19. BYU (.64)
20. UC-Berkeley (.62)
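The per capita score above is straightforward arithmetic: each school's total article hits divided by its research faculty count. A minimal Python sketch, using invented hit and faculty counts (the school names are real, but these numbers are hypothetical, chosen only to reproduce the top two scores):

```python
# Hypothetical illustration of the per capita ELS score: total article hits
# for a school's research faculty, divided by the size of that faculty.
# The counts below are invented for illustration only.
hits = {"Cornell": 82, "Chicago": 86}      # articles matching the search
faculty = {"Cornell": 35, "Chicago": 40}   # research faculty headcount

def per_capita(hits, faculty):
    """Return (school, score) pairs ranked by hits per faculty member."""
    scores = {s: round(hits[s] / faculty[s], 2) for s in hits}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(per_capita(hits, faculty))
# → [('Cornell', 2.34), ('Chicago', 2.15)]
```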
What do you think of this new measure? It will affect the overall rankings, as we’ll see tomorrow. As a final note, I again want to thank my research assistant Geoff Turvey.
I heartily concur with this opinion from two highly respected empirical researchers. Like JSP and the CSL&S, the Institute for Legal Studies -- established in 1985 at the University of Wisconsin Law School -- developed from a longstanding institutional interest in combining law and social science -- a tradition which dates back at least to 1914, when a Wisconsin professor gave a paper on Eugen Ehrlich’s “Living Law.” Like Edelman and Simon, I am hoping that the current flurry of interest in empirical research among law schools will proceed with careful attention to high standards and will recognize the relevance and importance of both qualitative and quantitative work. Careful work will lead to more tentative conclusions, which will in turn capture fewer headlines. But it will make the best use of what social science has to offer law.
Posted by: Howard Erlanger | 11 October 2006 at 08:59 PM
As participants in two of the oldest programs in empirical legal studies, the Center for the Study of Law & Society (est. 1966), and the Jurisprudence and Social Policy program (est. 1978), both at UC Berkeley, we could not be more pleased at the energetic burst of empirical work on law represented in this ranking survey. It has taken alarms going off every two or three decades, but it appears that academic law in the US is awakening from its analytic slumber. Nonetheless, we believe that the survey methodology was flawed and gives a distorted picture of the academic landscape and, more importantly, a misleading idea of what empiricism in legal studies represents.
First, no survey is better than its sample, and there are real limitations to using Westlaw's JLR database as a measure of productivity in empirical legal studies. Most importantly, the JLR database does not include the top disciplinary social science journals in which many empirical legal scholars publish their empirical articles (including the American Journal of Sociology, the American Sociological Review, the American Economic Review and the American Political Science Review). Scholars with social science training often prefer these journals because they are peer-reviewed. The JLR database not surprisingly reflects the overall market for academic legal source material, inevitably a lagging indicator of any empirical turn in that field.
Second, searching for the term "statistical significance" is a very poor measure of empirical scholarship. The Westlaw search term that was apparently used in this search (“statistic! /1 significan!”) would miss the many empirical articles that use measures of statistical significance but do not use the term ‘statistic’ (or its variants) within one word of ‘significant’ or its variants. It is common to discuss a finding and then to put the significance level in parentheses (e.g. “p<.05”), or simply to write "significantly different from" or even "different from" or any number of other similar phrases. It is also common to report significance levels in tables only, or to report standard errors (so that the reader can calculate the degree of statistical significance), without using the term “statistical significance.” Of course, quantitative analyses of a population rather than a sample would not need to report statistical significance to address sampling error (although some scholars do so anyway).
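The gap Edelman and Simon describe is easy to see if the Westlaw connector is approximated as a pattern match. A minimal Python sketch (this regex is my rough approximation of Westlaw's root expander and /1 proximity connector, not its actual matching engine):

```python
import re

# Approximation of the Westlaw query "statistic! /1 significan!":
# a word beginning with "statistic" within one word of a word beginning
# with "significan", in either order.
PAT = re.compile(
    r"\bstatistic\w*\W+(?:\w+\W+)?significan\w*"
    r"|\bsignifican\w*\W+(?:\w+\W+)?statistic\w*",
    re.IGNORECASE,
)

caught = "The effect was statistically significant at the 5% level."
missed = [
    "The difference between groups was significant (p < .05).",
    "Treatment firms were significantly different from controls.",
]

assert PAT.search(caught)                       # the query finds this phrasing
assert not any(PAT.search(s) for s in missed)   # but misses these common forms
```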
Third, searching for any term that might indicate empirical scholarship tells us nothing about the quality of that scholarship. As Epstein and King noted in “The Rules of Inference” (69 University of Chicago Law Review 1), empirical legal scholarship varies substantially in quality, and work published in law reviews can suffer from a lack of peer review.
Finally, and this is a much more general problem, the survey assumes that only quantitatively based studies are truly empirical. Quantitative studies (using proper samples and methodology) can provide general descriptive and inferential knowledge (for example, about the level of a particular effect from a particular socio-legal dynamic). But qualitative empirical studies using interview, ethnographic, observational, archival or similar methodologies should be viewed as a critical part of the empirical landscape as well because they can offer important insights into the dynamics underlying the effects demonstrated through quantitative work and can be useful in generating hypotheses to be tested using quantitative work. Consider, for example, Robert Ellickson’s study of dispute resolution among Shasta County ranchers and farmers (Order Without Law); Patricia Ewick and Susan Silbey’s empirical studies of legal consciousness (The Common Place of Law); Michael McCann’s analysis of the legal mobilization of rights in the pay equity context (Rights at Work); Catherine Albiston’s empirical analysis of how institutionalized conceptions of work, gender, and disability shape workers’ mobilization of their rights under the Family and Medical Leave Act (39 Law & Society Review 11); Nicholas Pedriana’s analysis of legal framing processes in the women’s movement (111 American Journal of Sociology 1718); and numerous other studies that enrich our understanding of legal processes due to careful analysis of qualitative data. (We disagree with an earlier commentator who wrote that practically all legal scholarship includes qualitative empirical analysis – it is important to distinguish work that uses qualitative data and proper empirical methodology from traditional doctrinal or normative analyses, which often do not involve any empirical analysis.)
Lauren Edelman, Director, Center for the Study of Law & Society, UC Berkeley
Jonathan Simon, Associate Dean for Jurisprudence and Social Policy, Boalt Hall, School of Law, UC Berkeley
Posted by: Jonathan Simon | 05 October 2006 at 01:39 PM
Hm. I guess my concern would be that -- on the other, hyper-quantitative end of the spectrum -- fewer and fewer well-trained social scientists are buying into the whole "statistical significance" / null-hypothesis-testing paradigm of inference. For a good explanation of why that is, look here:
http://psblade.ucdavis.edu/papers/hypo.pdf
That means that the measure will increasingly miss instances where people are, in fact, doing statistical/empirical work but not using the terminology.
Posted by: Christopher Zorn | 01 October 2006 at 05:33 PM
it's an empirical question whether "els" refers only to quant work, but in my view it shouldn't. and i don't think it is quite accurate to say that nearly all legal scholarship includes qualitative empirical analysis. much is simply textual, doctrinal, or otherwise concerned with particular legal rules (themselves empirical phenomena, but of a very special sort) or is theoretical in nature. actual qualitative study of the effects of legal rules, for instance, isn't all that common in my experience. and what does exist is rarely methodologically self-conscious.
Posted by: kal raustiala | 29 September 2006 at 07:47 PM
Kal clearly is correct that the term "empirical" includes both qualitative and quantitative work. However, "empirical legal scholarship" generally refers to work with quantitative analysis as a central component, including both experimental and observational data. When law professors talk about the rise of ELS, they usually are referring to this quantitative work only. Almost all legal scholarship includes qualitative empirical analysis.
Posted by: Tracey George | 29 September 2006 at 03:39 PM
i'm glad to see my colleagues score well on this particular measure but the fact remains that empirical scholarship and quantitative analysis are not the same thing. in the social sciences we generally consider qualitative work to be equally empirical (though we argue, at least in political science, where i come from, about whether it is as rigorous). so to equate empiricism with regressions is a bit odd.
Posted by: kal raustiala | 29 September 2006 at 02:57 PM
I have some suspicion that the false positives are larger than suggested by the reference, "Or the article may describe the concept in more meaningful terms for a lay audience and to avoid confusion with practical significance." If an article merely includes a footnote identifying another's work followed by a parenthetical such as "finding a statistically significant relationship", the article is classified as empirical. A brief scan in "kwic" view of a few results of this search suggests the majority of articles identified by this search may well not be empirical.
Posted by: Royce Barondes | 29 September 2006 at 11:36 AM
Chris should get extra points for writing with Jeff.
Posted by: frank cross | 28 September 2006 at 01:56 PM
I ran separate searches for each school, listing each faculty member as a possible author. Thus, I don't have information on all hits.
What percentage of those hits are at top 50 schools? I can only give you an estimate. The top 50 schools have a total of 1503 citations. However, these are not unique citations due to cross-school collaborations. Thus, when Jeff Rachlinski (Cornell) and Chris Guthrie (Vanderbilt) write an article discussing statistical significance, both schools receive credit for one citation. (Collaboration within a school, however, does not result in multiple credit to that school for one article.) Collaboration is more common in ELS than in other work. Thus, I wouldn't be surprised if top-50 law schools accounted for only 20% of all hits.
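The credit rule described above (each school represented on an article gets one credit, but at most one credit per article no matter how many of its own faculty co-authored it) can be sketched with invented data:

```python
from collections import Counter

# Each article is represented as the SET of schools among its authors,
# so multiple co-authors from the same school collapse to one credit.
# The article data below is invented for illustration only.
articles = [
    {"Cornell", "Vanderbilt"},  # cross-school collaboration: 2 credits total
    {"Cornell"},                # two Cornell co-authors: still 1 Cornell credit
]

credits = Counter(school for article in articles for school in article)
print(credits)  # per-school credits
print(sum(credits.values()), "credits from", len(articles), "unique articles")
```

This is why summed school credits (1503 for the top 50) can exceed the number of unique articles behind them.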
Posted by: Tracey George | 28 September 2006 at 01:18 PM
I, for the most part, like the measure. I just ran the search on Westlaw -- STATISTIC! /1 SIGNIFICAN! & DA(AFT 1/31/1995 & BEF 9/28/2006) -- and received 6845 hits. Did you code all of these for school affiliation and for whether the author is part of the "research faculty"? If so, do you have the rankings for all the schools that show up? Do any lower-ranked schools (based on U.S. News) perform comparatively better under this measure?
Posted by: Jason Czarnezki | 28 September 2006 at 12:59 PM