It's been fun guest blogging this week, but this is likely my last post. I'm going to wade into the ongoing discussion here and elsewhere on what kinds of statistical and social science background law students should get (or be offered) and lawyers need. The easy answer to those questions, not likely to be controversial here, is more than they get! Perhaps most useful to the students are those classes that focus on the types of numeracy and related skills that lawyers need when dealing with expert witnesses.
But I don't think that teaching about such matters should be restricted to such classes or to classes covering, for example, empirical studies of judicial decisionmaking. In my employment law class, for example, I cover the law governing the use of polygraph tests and other kinds of screening tests by employers. While I'm at it, however, I walk the students through a (very simple) set of exercises aimed at helping them understand the statistics of screening tests. What does it mean to say that a test is 90% accurate, or that it has a specificity of 95% but a sensitivity of only 70%? Why might you want to know something about the prevalence or expected prevalence of whatever it is you are screening for? How do you interpret a "positive" result? Invariably, many of my students find this discussion intimidating, even panic-inducing. ("I didn't realize we'd have to do math in this class!" and "Is this going to be on the final?" are the two most common responses.) Yet an employment lawyer advising a client on whether the actual or potential legal hassles of using and relying on different types of screening tests are worth it simply cannot give competent advice without knowing how to think about these questions -- or at least without knowing that those questions exist and need to be answered.
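The arithmetic behind those classroom questions is worth making concrete. The sketch below, using the hypothetical figures from the paragraph above (sensitivity 70%, specificity 95%) and two assumed prevalence levels, shows why a "positive" result means very different things depending on how common the screened-for condition is:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that someone who tests positive actually has the condition."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# With a rare condition (1% prevalence), most positives are false alarms:
ppv_rare = positive_predictive_value(0.70, 0.95, 0.01)
# With a common condition (30% prevalence), a positive is far more informative:
ppv_common = positive_predictive_value(0.70, 0.95, 0.30)

print(f"PPV at  1% prevalence: {ppv_rare:.1%}")   # roughly 12%
print(f"PPV at 30% prevalence: {ppv_common:.1%}") # roughly 86%
```

Same test, same accuracy numbers, wildly different meaning of a positive result -- which is exactly why a lawyer needs to ask about prevalence before advising a client on a screening program.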
I'm sure most readers of this blog wouldn't disagree and likely find ways to include numeracy when they teach doctrinal classes. I'm also sure many of our colleagues who do not consider themselves ELS devotees do so as well. It would be interesting to know how often this happens and, to the extent it doesn't, whether law professors believe more coverage would be helpful but do not feel competent to present the material themselves. I believe that statistical and social science information closely related to the doctrinal subjects already taught may be the easiest to "sell" to faculty and students alike -- certainly an easier sell than, say, empirical studies of judicial decisionmaking or general statistics classes that are not directly tied to the study of doctrine or the practice of law.
I've always wondered how people in law schools handle this conundrum. Because that is what I think it is. The basic problem is that the law wants exact results, and all that statistics can give -- and that only with heroic assumptions and the right circumstances -- are levels of precision. I've been consulted by attorneys many times on statistical questions and the problem is always the same: they want to know if there is a "statistically significant result". What they mean by that is a result that reaches the sacred "p<.05" level or "better". When I inform them of what is necessary to make even such a relatively uninformative finding believable, their reaction is always impatience with my "pussyfooting". When I go further and explain what statistical significance actually connotes, they get downright feisty about it. "But that's the standard the courts have recognized! It's in the precedents! That's all I need to worry about!"
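One thing "p<.05" does not mean is that the claimed effect is real. A minimal simulation makes the point: if you run many comparisons where there is, by construction, no true difference at all, about 5% of them will still come up "statistically significant". (Everything here is assumed for illustration -- the sample size, the known-variance z-test, the number of trials.)

```python
import math
import random

def two_sided_p(z):
    """Two-sided p-value for a standard-normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2))

def false_positive_rate(trials=2000, n=50, seed=1):
    """Fraction of 'significant' (p < .05) results when NO real effect exists."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Both groups drawn from the SAME population (mean 0, variance 1).
        a = [rng.gauss(0, 1) for _ in range(n)]
        b = [rng.gauss(0, 1) for _ in range(n)]
        se = math.sqrt(2 / n)  # standard error of the difference in means
        z = (sum(a) / n - sum(b) / n) / se
        if two_sided_p(z) < 0.05:
            hits += 1
    return hits / trials

print(false_positive_rate())  # close to 0.05 by construction
```

In other words, the p<.05 threshold is a statement about how often pure noise would look this extreme, not a guarantee that the finding before the court is genuine -- which is roughly the explanation that makes attorneys "feisty".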
And they're right. I suspect that the use of more sophisticated techniques and of a truly scientific approach to legal questions is considered MEGO material in law schools, unless, as Bill says, it is tied to research about the profession itself.
I wouldn't worry about this if there weren't clear evidence by now that the courts are falling WAY behind the curve in the use of statistical evidence to reach decisions on important matters. There are, in fact, a good number of questions that more developed techniques and a more analytic view of statistics could usefully address. I see very little evidence that the courts are moving toward using either. The disconnect this could create between empirically oriented law school faculty and a still-groping bench isn't healthy either.
But I don't teach in law school. Maybe I'm just seeing monsters in the closet.
Posted by: Tracy Lightcap | 05 March 2007 at 12:33 PM
Hi Carolyn, thanks for an interesting week of guest blogging.
Re innumeracy among law students, I think it is important to emphasize to students that a lawyer will prosper if he or she is willing to learn and master new subject matter; in some cases, this will involve numbers. I worry about my students who cannot figure out how many director seats are needed to control a seven-member corporate board (four!). They clearly have the analytical ability to accomplish this task; perhaps many have opted out of any serious math since the 12th grade, so even basic applied arithmetic becomes a challenge.
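The board-control question is just strict-majority arithmetic, and it generalizes to any board size. A two-line sketch (illustrative only -- real control can also turn on quorum rules, classified boards, and cumulative voting, none of which are modeled here):

```python
def seats_to_control(board_size):
    """Smallest number of seats constituting a strict majority of the board."""
    return board_size // 2 + 1

# A seven-member board takes four seats to control; nine takes five.
for size in (5, 7, 9, 11):
    print(f"{size}-member board: {seats_to_control(size)} seats")
```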
Re empirical methods in the classroom, I have found that students are more than willing to learn about correlation coefficients and t-tests when the subject matter is lawyers and law firms--of course, the answers to the hypothesis tests have direct bearing on their future careers. My colleague at a Top 5 law school has reported the same observation.
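For students meeting correlation for the first time, the computation itself is short enough to show in full. The sketch below implements Pearson's r from its definition; the salary and billable-hours figures are entirely made up for illustration and do not come from any real firm data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical data: starting salary (in $1000s) and hours billed per year
# at six imaginary firms.
salary = [125, 135, 145, 160, 160, 145]
hours = [1900, 1950, 2100, 2200, 2300, 2000]

print(round(pearson_r(salary, hours), 2))  # 0.94
```

A strong positive correlation in made-up numbers like these is the kind of result that gets students' attention, since it speaks directly to the trade-offs in their own career choices.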
Thanks for your contributions this past week. bh.
Posted by: William Henderson | 05 March 2007 at 08:01 AM