Cornell colleagues Ted Eisenberg and Marty Wells empirically analyze leading ranking metrics for refereed law journals in their recent paper, Ranking Law Journals and the Limits of Journal Citation Reports. Their analysis of ranking outcomes focuses on the field's preoccupation with ordinal ranking and on the effects of database bias. The abstract follows:
"Rankings of schools, scholars, and journals emphasize ordinal rank. Journal rankings published by Journal Citation Reports (JCR) are widely used to assess research quality, which influences important decisions by academic departments, universities, and countries. We study refereed law journal rankings by JCR, Washington and Lee Law Library (W&L), and the Australian Research Council (ARC). Both JCR’s and W&L’s multiple measures of journals can be represented by a single latent factor. Yet JCR’s rankings are uncorrelated with W&L’s. The differences appear to be attributable to underrepresentation of law journals in JCR’s database. We illustrate the effects of database bias on rankings through case studies of three elite journals, the Journal of Law & Economics, Supreme Court Review, and the American Law & Economics Review. Cluster analysis is a supplement to ordinal ranking and we report the results of a cluster analysis of law journals. The ARC does organize journals into four large groups and provides generally reasonable rankings of journals. But anomalies exist that could be avoided by checking the ARC groups against citation-based measures. Entities that rank should use their data to provide meaningful clusters rather than providing only ordinal ranks."