I stumbled across what could be a promising archival project involving the California appellate courts:
In 2006, the California Appellate Court Legacy Project was
undertaken to interview all retired justices in the state, as well as active
justices who may be nearing retirement. Overseen by the Appellate Court Legacy
Project Committee (chaired by Associate Justice Judith L. Haller of the Fourth
Appellate District, Division One), interviews are videotaped or audiotaped and
conducted by interviewers selected from within the appellate branch. Ultimately
the tapes will be made available to judicial colleagues, historians, scholars,
law students, and members of the public. The resulting archive will be an
historical record of both the personal experiences of individual justices and
the evolution of the California appellate courts.
Unfortunately, I haven't found any information about the project's
current status or what types of questions the interviewers are asking the judges. Actually, I can't find much of anything about this project other than the statement above, which comes from the California Courts' website.
New ICPSR data releases here. One that caught my eye is the 2004 update for the National Judicial Reporting Program Series. Although the 2004 sample is a cross-section, the series enables longitudinal analysis from 1986 to 2004. Here is a description for the 2004 dataset:
This data collection provides detailed information on the
sentences and characteristics of convicted felons based on data
collected from state courts. The 2004 survey was based on a sample of
300 counties selected to be nationally representative. The collection
contains sociodemographic information such as age, race, and sex of
the felon. Types of offenses committed include homicide, rape, and
robbery. Data can be analyzed at the national level or by individual counties.
There is a lot of county-level demographic data available from the Census Bureau, including shape files for GIS mapping software. All of these data sources can be linked together in a matter of hours thanks to Census Bureau FIPS codes. For a criminal law scholar interested in empirical research, this is a great (and free) starter data set.
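As a minimal sketch of the kind of linkage FIPS codes make possible (the county records and figures below are illustrative placeholders, not values from the NJRP or Census files), a county-level merge in Python might look like:

```python
# Hypothetical county-level records keyed by 5-digit FIPS code.
# All figures are illustrative placeholders, not real NJRP or Census values.
sentencing = {
    "06037": {"median_sentence_months": 48},  # Los Angeles County, CA
    "17031": {"median_sentence_months": 36},  # Cook County, IL
}

census = {
    "06037": {"population": 9_800_000, "median_income": 55_000},
    "17031": {"population": 5_200_000, "median_income": 54_000},
}

# Because both sources share the FIPS key, linking is a simple join:
merged = {
    fips: {**sentencing[fips], **census[fips]}
    for fips in sentencing.keys() & census.keys()
}
```

The same key joins just as cleanly to the Census shape files, so a merged table like this can be dropped straight into GIS mapping software.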
College football fans, coaches, and observers have adopted a set of
beliefs about how college football poll voters behave. I document three
pieces of conventional wisdom in college football regarding the timing
of wins and losses, the value of playing strong opponents, and the
value of winning by wide margins. Using a unique data set with 25 years
of AP poll results, I test college football's conventional wisdom. In
particular, I test (1) whether it is better to lose early or late in
the season, (2) whether teams benefit from playing stronger opponents,
and (3) whether teams are rewarded for winning by large margins.
Contrary to conventional wisdom, I find that (1) it is better to lose
later in the season than earlier, (2) AP voters do not pay attention to
the strength of a defeated opponent, and (3) the benefit of winning by
a large margin is negligible. I conclude by noting how these results
inform debates about a potential playoff in college football.
Today I am in California for a hearing before the Board of Governors, which is the elected body that oversees the state bar exam. For the past 18 months, I have been part of a research team (with Richard Sander, Vik Amar, Doug Williams, and Stephen Klein) seeking California data to test a possible mismatch effect in law schools. The substance of the hearing today (see agenda here) was to determine whether we would be permitted to obtain regression results, via the Bar's own longtime psychometrician, from the Bar's rich archive of historical data.
After hearing approximately 50 minutes of prepared testimony from those for and against our study, the Board voted to deny our data request. Based on the tenor of the meeting, which was set by a well-organized opposition led by Cheryl Harris (UCLA Law) and Michele Landis Dauber (Stanford Law), the vote turned on legal uncertainties surrounding candidate confidentiality and consent.
Acceding to these concerns has, in my opinion,
three major flaws: (a) our research team would only have access to
regression tables with results aggregated among groups of schools, so
no individually identifiable information would ever be released to
researchers or the public; (b) we would, at all times, be subject to
university IRB protocols; and (c) the broad construction given to
consent in this context suggests that much of the research sponsored by
the California Bar over the last 20 years has been unlawful; moreover,
this decision casts doubt on the scope of any future research by the Bar.
[Note: Richard Lempert (Michigan Law and formerly chair of the NSF Law & Social Science Program), wrote this memo on the IRB issue, which he graciously forwarded to our research team in advance of the hearing. Rick's assessment, however, is quite different from my own experience in obtaining IRB approval at three different institutions for projects which raised, in my opinion, much more significant human subjects issues. Regardless, no research would occur without IRB approval; that should be obvious to all parties involved.]
Of course, the Board of Governors is not an adjudicative body -- rather, in this
case, it is the regulatory unit that would be dragged into court if it
approved our proposal. Thus, on one level, I can understand the institutional
reasons for taking a cautious approach. (For a snapshot of the public debate leading up to this decision, see this National Law Journal op-ed by Cheryl Harris and Walter Allen, and our NLJ response; in the spirit of transparency, all documents related to our projects are posted on this website.)
Although I am disappointed with the outcome, I am proud of the testimony we put on today. The basic theory underlying the mismatch effect is that recipients of large preferences (regardless of race) tend to learn less because the classroom instruction is pitched toward a modal student with significantly higher entering credentials. Honest researchers can argue over the methodology and inferences of Sander's Systemic Analysis study. But existing data provide little comfort that our current system is working well. Minority students are disproportionately clustered at the bottom of their class. Since LGPA is the single best predictor of bar passage, it is not surprising that these same students struggle on the bar examination. Sander's first choice/second choice analysis (in his 2005 Stan. L. Rev. response essay) suggests that law school performance and bar passage prospects actually increase when a student opts for a less elite law school. In my opinion, if this theory is wrong, it is imperative that it be debunked empirically rather than through efforts to withhold data.
As a lifelong Democrat and ardent supporter of racial diversity, I don't need to apologize for raising these issues. As I said in my prepared remarks, it does not follow that evidence of a mismatch effect requires the dismantling of racial preferences -- indeed, in my opinion, this would be a very bad idea.
But if we fail to diagnose the factors that contribute to low minority bar passage, we have no basis to formulate effective policy or educational strategies. Regardless of which way the data cut, our study would have guaranteed one of two favorable outcomes:
Using the more refined California Bar exam (i.e., a continuous dependent variable rather than pass/fail and schools put in analytically useful clusters, unlike the LSAC-BPS data), the mismatch theory would have little or no empirical support. So a contentious academic theory would be put to bed, at least in the law school context.
A mismatch effect would be supported, which would pressure law schools to take concrete steps to help current or prospective students. These might include: (a) disclosures that reveal bar passage prospects for past students with similar entering credentials; (b) creation of rigorous academic support programs (such as this one) that increase bar passage for students in the bottom half of the class; and (c) identification of curricular and teaching strategies that produce higher bar exam scores.
As legal educators, we should want more than the status quo.
Tom Brunell (UT-Dallas) and Alec Stone Sweet (Yale Law) have updated their important data on cases in the European Court of Justice (ECJ). The data now cover ECJ cases from the 1950s through 2006. They've also expanded the data to include actions brought under Article 226 (infringement proceedings - where the Commission brings suit against a member state) and Article 230 (annulments - where an individual or group sues the Commission itself), in addition to their original Article 234 (preliminary references) data. Jason has kindly added a link to the "Databases" bar on the right; look for the "Litigating EU Law Data" link.
Entering credentials of transfer students are irrelevant to U.S. News ranking calculations. As a result, many have speculated that transfers are fertile ground for U.S. News gaming. This post examines data on 2L transfers, which the ABA recently collected and released in the 2008 Official Guide. In short, there are a lot of transfers, particularly to schools with high U.S. News rankings. Here is the breakdown by U.S. News tier:
Why am I posting on transfer students? That is an important subtext. During a recent conversation, hiring partners at one of the nation's largest and most prestigious firms (biglaw X) complained to me that they were running across a lot of transfer students at elite law school Y (where pre-screening is not permitted) with nondistinguished 1L records from Tier 2 or 3 schools. As a follow-up to our conversation, I crunched the numbers from the 2008 Official Guide.
But the more salient issue is how firm X responded in the absence of hard data. Because the transfer students could not meet the grade cutoff for an interview at their old school, (a) they did not receive a callback, (b) firm X wasted 30 minutes x 2 lawyers for each incident, and (c) a perception took root that elite law school Y was diluting its product through an unprincipled transfer policy designed to "sell" 2L admission at full tuition price in order to constrict and subsidize a smaller crop of 1L students with higher LSAT scores--and, in the process, increase its U.S. News ranking. The hiring partners had a clear, unflinching grasp of the underlying dynamics. Did administrators at law school Y really believe that no one would notice?
In our Student Quality as Measured by LSAT Scores study, Andy Morriss and I theorized that a heavy intake of transfer students was likely the preferred gaming strategy of high ranked schools (lower ranked schools, in contrast, relied upon part-time programs). For schools that were in the U.S. News first quartile in 1992, constriction of the 1L full-time class over the next 12 years was associated with larger gains in median LSAT scores. (At the time, we lacked the data on transfers, which we thought would reveal a revenue-neutral strategy that worked in lockstep with 1L constriction.)
But here are the difficult institutional questions:
Reputation. Is a slight increase in median 1L LSAT
scores worth the damage to a law school's reputation among employers? Arguably, law school Y's biggest mistake was taking all comers rather than building a larger pool of transfer applicants. Apparently, some Tier 1 and 2 law schools actively solicit transfer applicants and thus can be more selective. Over time, this could easily evolve into a second, substantial admissions process.
Payoffs to the school. Although students and faculty may be jubilant over a bump in the rankings from #14 to #12, or #22 to #19, are more opportunities thereby available to graduates, especially if the gaming strategy is being flagged by key employers?
Payoffs to transfer students. And what about the students with nondistinguished records from lower ranked schools who incurred huge debt for an elite credential in order to subsidize a high LSAT 1L--are these transfer students, with a paucity of callbacks and no 1L peer relationships, clearly better off? See the 2005 LSSSE Report (showing heightened social isolation for transfer students).
Externalities on legal education. Can we all agree that any underlying increase in U.S. News rank has zero substance? If so, should the ABA crack down on law schools that divert a lot of resources toward recruiting 2L students for no other purpose than boosting entering credentials of 1Ls?
Without further ado, here is the graph for transfers (Tiers 1-2), ordered by 2007 U.S. News rankings and normalized as a percentage of each school's graduating class [click-on to enlarge]. In Tier 1, Georgetown (+14%) and Washington University (+18%) have the largest net inflow of transfer students. In Tier 2, the big net gainers are Florida State (+20%) and Rutgers-Camden (+16%). The graph for Tiers 3-4, in many respects the mirror opposite of Tiers 1-2, is after the jump.
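For concreteness, the normalization behind those percentages is simply net transfer inflow divided by graduating-class size. A quick sketch with hypothetical numbers (not any particular school's figures; the actual counts come from the 2008 ABA Official Guide):

```python
# Hypothetical figures for a single school -- placeholders only,
# not the Official Guide numbers for any school named in this post.
transfers_in = 56
transfers_out = 14
graduating_class = 300

# Net transfer inflow, normalized as a share of the graduating class
net_inflow_pct = 100 * (transfers_in - transfers_out) / graduating_class
print(f"{net_inflow_pct:+.0f}%")  # prints +14%
```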
I am working on a series of interrelated projects on law firm economics. In my research, I recently ran across an interesting NALP data release on median associate salaries by year and firm size (2007 income). To visualize the data, I generated the trend chart below [click to enlarge]. As the saying goes, a picture is worth a thousand words.
The median salary spread for a 1st year associate at a 2-25 lawyer firm versus
a 500+ firm is $77,000. When the reference group is a 51-100 lawyer firm versus
500+, the differential is still a substantial $55,000 per year. Cumulatively, for all eight years of the associate track, the spread amounts to $631,000 for 2-25 vs. 500+ lawyer firms and $524,000 for 51-100 vs. 500+. In other words, the bimodal distribution discussed in an earlier post appears to hold fairly steady during the first several years of a young associate's career. In fact, most 8th year associates at firms smaller than 250 lawyers are making less than a 1st year associate at a 500+ lawyer firm.
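The cumulative figure is nothing more than the per-year median spread summed over the eight-year associate track. In the sketch below, only the first-year spread ($77,000) and the eight-year total are taken from the post; the intermediate yearly spreads are hypothetical fillers that illustrate the arithmetic, not the actual NALP medians:

```python
# Per-year median-salary spread (500+ vs. 2-25 lawyer firms).
# Year 1 is the $77,000 spread discussed above; years 2-8 are
# hypothetical placeholders, not the actual NALP figures.
yearly_spread = [77_000, 79_000, 80_000, 80_000, 79_000, 79_000, 78_000, 79_000]

cumulative_spread = sum(yearly_spread)  # totals $631,000 over the track
```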
Before readers (especially students) anchor on the dollars, it is worth asking whether the smaller paychecks translate into substantially fewer hours and a better work/life balance. There are two fairly good sources of data to address this question: (1) the After the JD Project, and (2) an ABA Young Lawyers Division Survey. Both of these datasets were compiled with the help of researchers from the American Bar Foundation. A related resource is the Young Associates in Trouble essay/study, which I co-authored last year with David Zaring (Wharton). One of our central insights, amply supported by data, was the large variation in work environments among large law firms.
Graphics from these studies appear after the jump.
With support from the National Science Foundation, Jeff Segal, Harold Spaeth, and Lee Epstein have just
rolled out the Digital Archive of the Papers of Harry A. Blackmun. In its
present form, the Archive houses PDFs of the docket sheets and preliminary
(pool) memoranda from Blackmun's years on the Rehnquist Court (1986-1993). We
hope its contents will interest a wide range of users, from those looking for
the votes (and clerk recommendations) in specific cases to those hoping to study
trends in the Court's case-selection and decision-making processes. The Archive's homepage is here.
To go directly to the Archive, use this link: http://epstein.law.northwestern.edu/research/BlackmunArchive/
Creating the Archive required Segal, Spaeth, and Epstein to work with hundreds of thousands of digital images. Therefore, as users spot problems, please send an email to Lee Epstein (email@example.com). The researchers will work to quickly correct errors as they become known. Finally, Jeff, Harold, and Lee realize that the Archive works (and looks) better
in some browsers than others. They are hoping to fix the problem soon.
Ever since graduate school, when a member of my dissertation committee lit into a draft paper of mine, I have had a healthy pet peeve about the word "data" (the plural form of the singular "datum" -- or so I thought). Over at the Social Science Statistics blog, Mike Kellermann's recent post digs into the usage wars waged over data and uncovers some ambiguity.
Mike appears to throw his lot behind the "data is singular because it is increasingly viewed as a 'mass noun'" camp. For me, however, Strunk & White's classic and time-tested The Elements of Style breaks any usage "ties." Regrettably, even here, while the tie is broken, a hint of change exists. "Data. Like strata, phenomena, and
media, data is a plural and is best used with a plural verb. The word,
however, is slowly gaining acceptance as a singular."
In my last post, I had a throwaway line on Columbia law professor Michael Dorf. For those following the link I provided, the clear implication was that Dorf's comments on the Chemerinsky affair -- e.g., "[e]ven solid but middling-ranked law schools can have at best a marginal impact on the course of legal education" -- were evidence of Dorf's overconfidence in his own judgment.
Dorf picked up on the Moneyball theme in the comments section, claiming that he in no way endorsed the data-blind approaches of traditional baseball scouts. Professor Dorf then extended the baseball analogy to buttress his underlying point:
“[A] discerning reader will see [from Dorf's original post and subsequent qualifying comments] that I was making a quite
different point: I was guessing [to some of us, it looked like concluding] that
elite law schools could do a middling job at actually educating students and
still not suffer serious reputational hits, because their students' talents
cover up for this. (This is why the Yankees have been a good team even though
they didn't follow the path of Billy Beane: their huge war chest covered up
their errors.) I don't deny that eventually good innovations win out but I say
that it will likely require a market leader to make the process work. Thus, the
Boston Red Sox have been more successful using Beane's strategy than the A's
were, because the Sox have done it with more money.
Sure, money / starting position matters. But the fact remains that virtually all major league clubs now employ the techniques originally created by a small market ("middling-ranked") team. Far from having a "marginal impact", these innovations revolutionized baseball.
Starting with a clean slate, could Chemerinsky do the same at UC Irvine? I suspect a large number of deans would love to play his hand: (1) the UC brand and faculty benefits, (2) a public tuition subsidy for students, (3) a spectacular geographic location adjacent to a vibrant legal market, (4) the ability to select an entire faculty based on a unique institutional vision, and (5) seed money from private donors.
In our empirical work on law school rankings (here and here), Andy Morriss and I found that (a) proximity to large legal markets and (b) low cost (e.g., public school or low avg debt) are both drivers of higher LSAT scores over time. UC Irvine has both of these advantages. Further, if UC Irvine can document that its innovative teaching adds more value in three years (a LONG time) than its tradition-bound competitors, then employers (and eventually students) will move the market. How elite is a law school when students and employers find a better value downstream?
At present, we do not know the potential benefits of innovative teaching methods on human capital--e.g., can the gains be permanent and persistent, trumping raw ability (as measured by LSAT scores, far from perfect) over time? This is an important empirical question. I know that some legal educators assume the answer is no. Because the stakes are so high, I would prefer to see the data.
If law schools -- during the course of three years -- can add substantial and persistent value beyond raw aptitude, the innovators (and I hope Chemerinsky is one of them) could get a huge competitive advantage, at least until the elite schools are pressured by market forces to learn and apply those same innovations. And this would be far from a "marginal impact".