First of all, thanks so much to ELS for taking the time to host a blog forum on my empirical study of the relationship between teaching and scholarship at American law schools, and to Bill and Jeff for agreeing to comment on the study. I can say at the outset that there were many times over the last three years when I wondered whether I would ever make it to the finish line on this project. The biggest problem was gathering the teaching evaluation data. The University of Tennessee's data is publicly available online (although in a very discreet and hard-to-find location), so I thought I would have little difficulty gathering data from 19 other American law schools. As it turned out, I needed to contact every law school dean and associate dean, and then file 12 state FOIA requests, before I reached 19 participating schools. I had originally aimed for 20 schools, but I would have had to file a lawsuit to make 20, so I called it a day at 19.
The difficulties I faced in gathering the data actually helped prepare me for my counterintuitive finding of no correlation, and for some of the critiques I've received. As you might guess, anyone who thinks that teaching evaluations are bunk has little use for the study. As you might also guess, these same folks have not been shy about sharing that opinion! I've presented the paper at three of the participating schools and at the SEALS conference, and have gotten some great feedback. Nevertheless, I always face an at least partially skeptical audience. Thus, I am most interested to hear what Jeff and Bill (and any blog readers or commenters) have to say on the subject. I am also quite interested in how people view the ramifications of the study. If it is true that teaching and research are not correlated, what (if anything) does that tell us about law school hiring, management, and rankings?