An exceptionally helpful source of data for those interested in U.S. Supreme Court decisions was recently updated to include data from OT2011. The Supreme Court Database (2012 release, v.01, here) "contains over two hundred pieces of information about each case decided by the Court between the 1946 and 2011 terms. Examples include the identity of the court whose decision the Supreme Court reviewed, the parties to the suit, the legal provisions considered in the case, and the votes of the Justices." An online codebook for this leading compilation of Supreme Court decisions (particularly for political scientists) can be found here.
Many--but not all--data sets analyzed in Stata begin life as Excel spreadsheets, so getting to Stata's analytics requires importing Excel files. All too often, what should be a straightforward task becomes anything but. A recent post on the Stata Blog walks readers through the data import process, with helpful screenshots.
The "import excel" command in Stata 12 can help import Excel files into Stata by addressing two specific issues. First, Excel files often contain header and footer information in the first few and last few rows of a sheet, and you may not want that information loaded. A second common problem is that many column labels used in Excel spreadsheet files are invalid as Stata variable names (and therefore cannot be loaded into Stata).
As I am partial to elegance and subtlety when it comes to graphs, tables, etc., the overly complicated graphics command structures in many of the leading software packages can quickly frustrate even experienced users. Moreover, transforming graphical output into useful forms, such as PowerPoint slides, generates its own set of challenges.
Thus, I was delighted to find over at the Social Science Statistics Blog a link to a wonderfully helpful slideshow by Matthew Sigal (York), replete with practical tips on how to post-process graphs with Adobe Illustrator. Though the examples make specific reference to graphs produced in R, the advice is easily translated to other statistical software packages. Finally, while Sigal's main point is to assist with the concrete task of enlisting Adobe into the service of better graphics, one implicit and broader point involves the importance of attending to the visual display of data.
Late last night, on a nearly party-line 218-208 vote, the U.S. House passed an amendment (by Rep. Flake, R-AZ) to HR 5326 to "prohibit the use of funds to be used to carry out the functions of the Political Science Program in the Division of Social and Economic Sciences of the Directorate for Social, Behavioral, and Economic Sciences of the National Science Foundation." The Monkey Cage has some of the relevant links. Efforts like this have been mounted before -- most recently in 2009, by Sen. Tom Coburn -- but none have gotten this far.
The actual debate on the defunding amendment (all five minutes of it!) is here, in the CR. I am not enough of a student of the appropriations process (or of legislative politics in general) to speculate on what might happen next. But I do think that if I were Subra Suresh, Myron Gutmann, or the SBE Advisory Committee -- or, for that matter, the directors of the NIJ, any of the NIH agencies, etc. -- I would be very concerned about the precedent that this would set. For Congress to begin micromanaging the NSF at the program level raises some serious concerns about the politicization of science.
One of my honors advisees this term has just completed her thesis, an empirical examination of the ACLU's ability to influence the U.S. Supreme Court through filing amicus curiae briefs during the Burger and Rehnquist Courts. It's a careful study, combining qualitative and quantitative methods. The thesis has a number of interesting findings; I won't go into all the details, but I wanted to share this:
It's a plot of the odds ratio for the presence of an ACLU brief (from a series of justice-specific logistic regressions that also included a host of controls) where the outcome variable is a pro-ACLU vote. The vertical line at 1.0 corresponds to "no marginal association;" values less than one are negative associations, while values above one are positive. (I'm omitting the CIs because they make the plot very messy.)
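For readers curious how justice-specific estimates of this kind are produced, here is a hedged sketch in Stata (the variable names `vote_pro_aclu`, `aclu_brief`, `justice_id`, and the controls are hypothetical; the thesis's actual specification may differ):

```stata
* One logit per justice; the "or" option reports odds ratios,
* so 1.0 corresponds to no marginal association
levelsof justice_id, local(justices)
foreach j of local justices {
    logit vote_pro_aclu aclu_brief i.issue_area lower_court_direction ///
        if justice_id == `j', or
}
```

The justice-specific odds ratios on `aclu_brief` are then what gets plotted against the reference line at 1.0.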
The interesting thing, of course, is that justices widely perceived as more politically liberal also tend to be those that are most "influenced" by the ACLU's amicus briefs.
Also, to all those NYC firms out there: My advisee, Shannon Azzaro, will be starting law school at Fordham in the fall, with a focus on intellectual property. I understand that her 2013 summer is still free...
Ronen Avraham (Texas) recently alerted me to an update of one of the more helpful resources around for those who study tort reform. Specifically, the Database of State Tort Law Reforms (DSTLR 4th) seeks to create one "canonized" dataset that "will increase our understanding of tort reform’s impacts on our lives." A fuller (though excerpted) description follows.
"This manuscript of the DSTLR (4th) updates the DSTLR (3rd) and contains the most detailed, complete and comprehensive legal dataset of the most prevalent tort reforms in the United States between 1980 and 2010.... The dataset records state laws in all fifty states and the District of Columbia over the last several decades. For each reform we record the effective date, a short description of the reform, whether or not the jury is allowed to know about the reform, whether the reform was upheld or struck down by the states’ courts, as well as whether it was amended by the state legislator."
Following up on an initial post (previously described here), the Stata blog has a second post on data set merging tips. The second post focuses on multiple key data set merges. Multiple-key merges are required when more than one variable is needed to uniquely identify observations in your data. This situation frequently arises in panel or longitudinal data sets.
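In Stata syntax, a multiple-key merge on panel data looks like this (the dataset and variable names are illustrative, not from the Stata Blog post):

```stata
* Each observation is uniquely identified by the pair (state, year),
* so both variables go on the merge key
use state_panel_main, clear
merge 1:1 state year using state_panel_extra

* Always inspect _merge to catch unmatched observations
tabulate _merge
```

Merging on `state` alone would fail here, since each state appears once per year; the second key variable is what makes observations unique.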
While I take no side (or sides) in the proverbial "Stata v. SPSS v. R" wars (I use all three), I want to note Stata's recent venture into the blog universe. Though it's far too early to see whether the Stata blog will succeed, it's an interesting development nonetheless. A recent post provides a helpful step-by-step guide to one of the more common (yet error-prone) data tasks: merging data sets. The explanation is helpful and flags all-too-common pitfalls.
Yesterday, Bob Morse of U.S. News published a blog post in which he signaled a change in the law school rankings methodology, specifically with regard to employment. The prevailing view on the law school administrator listservs (which nearly a dozen people have forwarded to us) is that U.S. News will be increasing the weighting of "employed at graduation," presumably because U.S. News Editor Brian Kelly sent a letter to law school deans--reprinted in Bob's blog post--discussing the importance of employed-at-graduation as a metric.
We have zero inside information, but we are willing to bet a substantial sum that any methodology change will be in a completely different direction. Here is why. Over the last decade, fewer and fewer schools have been supplying U.S. News with employed-at-graduation data. Employment at graduation is not a statistic required or collected by the ABA; as such, its accuracy cannot be checked through cross-reference to the annual ABA-LSAC Official Guide.
But much more significantly, when a school fails to provide this data, U.S. News has--up until now--imputed the figure based on employment at 9 months. (Kudos to Ted Seto for unraveling this mystery. See Understanding the U.S. News Law School Rankings.) Crudely speaking, the magazine applied a roughly 30% discount rate on the employed at 9 months figure. Earlier this year, Paul Caron suggested that if a school's employed-at-graduation rate is more than 30% lower than its employed at 9 month rate, it is "rankings malpractice" to supply U.S. News with the data.
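As a concrete illustration of that imputation rule (the 30% figure is Seto's estimate, not a published U.S. News formula, and the variable names below are hypothetical):

```stata
* Hypothetical sketch: a school reporting 90% employed at 9 months but
* withholding the at-graduation figure would be imputed 0.70 * 90 = 63%
gen emp_grad_imputed = cond(missing(emp_at_grad), 0.70 * emp_9mo, emp_at_grad)
```

Under that rule, withholding the number is costless for any school whose true at-graduation rate falls more than 30% below its 9-month rate, which is precisely Caron's "rankings malpractice" point.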
As readers can see from the above chart (generated by Paul Caron in his rankings malpractice post), a large proportion of law schools have figured out the payoffs. Over the last decade, the percentage of non-reporting schools has skyrocketed. With this information in mind, Bob Morse's blog post may seem less cryptic:
In an effort to make our law school employment data more reflective of the current state of legal employment, U.S. News has modified how we calculate the employment rates that are used in the new law school rankings. ...
U.S. News agrees with the efforts of Law School Transparency to improve employment information from law schools and make the data more widely available.
If the goal is (a) to utilize data that better reflect reality, and (b) provide greater transparency and access to such data, it makes no sense to increase the weight of an input (employed-at-graduation) that is either withheld by law schools or is heavily gamed. This latter point is made by U.S. News Editor Brian Kelly in his letter to the deans:
[E]mployment after graduation is relevant data that prospective students and other consumers should be entitled to. Many graduate business schools are meticulous about collecting such data, even having it audited. The entire law school sector is perceived to be less than candid because it does not pursue a similar, disciplined approach to data collection and reporting.
At U.S. News, we work to make meaningful and fair comparisons, based on industry-accepted data. ...
To eliminate some of the gaming that seems to be taking place, we have changed the way we compute employment rates for the rankings due out March 15. In addition, we will also be publishing more career data than we have in the past in an effort to help students more completely understand the current state of legal employment. We think more still needs to be done.
So Kelly is saying that employed-at-graduation data are important, and the magazine is tired of being gamed. Therefore, we think two methodological changes have a good chance of being implemented:
U.S. News is likely to heavily penalize schools that withhold the employed-at-graduation data. Going forward, the imputation may be far more negative than 30% off of employed at 9 months. A drop in rankings would stop the non-response problem in its tracks.
Regarding perceptions of gaming, it is possible that U.S. News has formulated a way to quantify how many jobs at graduation map onto full-time professional jobs that require a law degree. For example, the ABA Official Guide provides lots of comparable data by practice setting. Law firm, judicial clerkship, and government jobs could be weighted more heavily than business or academic jobs. Graduates whose status is unknown may also be treated as 100% unemployed rather than under the current presumption that 25% of them are employed. Such changes would have the law schools scrambling to report better numbers in higher-weighted categories rather than just finding ways to goose up the employed-at-graduation and employed-at-9-months figures. Remember that Bob Morse explicitly endorsed the Law School Transparency movement.
We would like to suggest to our colleagues in the legal academy that we are approaching an endgame. Here is the reality: prospective students are not being given an accurate picture of their future employment prospects. Why? Because we are all focused on filling next year's class with as many high credential students as possible, thereby protecting our school's place in the pecking order. Our focus is so shockingly narrow that, from the outside looking in, it appears that our intent is to deceive incoming students. Brian Kelly's letter to the deans essentially makes that point--law schools fall short on candor and ethical behavior.
The numbers that get submitted to U.S. News include many graduates who are technically employed but often significantly underemployed, often at positions that don't require law degrees. Finer grained data get reported to NALP, but they are never published on a school-by-school basis. If these data were released, prospective students may not fully process the information--that is an argument that we often hear law professors make. But that does not alter our duty to provide "basic consumer information ... published in a fair and accurate manner reflective of actual practice." ABA Accreditation Standard 509.
At some point, all our lawyerly rationalizations will come to a bad end because a governmental agency or a court is going to challenge our right to self-regulation, thus ushering in a truly disgraceful chapter in the history of American legal education.
Now is one of the very few moments in our careers as academics where we have to make hard choices and demonstrate that we warrant the trust and respect of our tenured positions. Through our governance organizations (ABA, LSAC, NALP, AALS), we need to implement a system of complete transparency on employment outcomes. If the system has real teeth, it will force us all to work very hard to ensure we are delivering value commensurate with the tuition dollars we collect.
It's the end of the road. We likely have one last chance to get it right.
Posting Andrew's call regarding his course on empirical legal scholarship made me wonder whether our readers are aware of his Judicial Elections Data Initiative? If not, take a look. This is GOOD stuff.
Over at Balkinization Brian Tamanaha (Wash U) posted graphs endeavoring to describe law student enrollment and BLS data on legal employment trends (2001-09). Visually, the two graphs support Brian's assessment: "Law schools thus responded to the worst recession in the legal market in at least two decades by letting in more law students." As Brian Leiter (Chicago) notes, however, "One can't tell, though, from the second chart [and the underlying BLS data] what portion of the downturn in 'legal employment' is a reduction in the employment of attorneys as opposed to other law-related employees."
In Child Support Guidelines and Divorce Rates, Margaret Brinig (Notre Dame) and Douglas Allen (Simon Fraser) draw on a large dataset (National Longitudinal Survey of Youth (NLSY)) to assess how variations in child support guidelines influence decisions to divorce. The paper's abstract follows.
"A child support guideline is a formula used to calculate support payments based on a few family characteristics. Guidelines began replacing court awarded support payments in the late 1970s and early 1980s, and were later mandated by the federal government in 1988. Two fundamentally different types of guidelines are used: percentage of obligor income, and income shares models. This paper explores the incentives to divorce under the two schemes, and uses the NLSY data set to test the key predictions. We find that percentage of obligor income models are destabilizing for families with high incomes. This may explain why several states have converted from obligor to income share models, and it provides a subtle lesson to the no-fault divorce debate."
ICPSR has just made available Wave 1 of the After the JD database. From the study description:
The After the JD project is designed to be a longitudinal study, seeking to follow a sample of approximately 10 percent of all the individuals who became lawyers in the year of 2000. It is the largest and most ambitious study ever undertaken by researchers of legal careers aiming to track the professional lives of more than 5,000 lawyers during their first 10 years after law school.
Ronen Avraham (Texas) has updated a tremendously useful resource for those assessing the impacts of various tort reforms. In the Database of State Tort Law Reforms (3rd), Ronen sets out to establish one "canonized" dataset that "will increase our understanding of tort reform’s impacts on our lives." Direct access to the dataset (in Excel) is found here. A more complete description of the project follows:
"... DSTLR (3rd) updates the DSTLR (2nd) and contains the most detailed, complete and comprehensive legal dataset of the most prevalent tort reforms in the United States between 1980 and 2008. ... The dataset records state laws in all fifty states and the District of Columbia over the last several decades. For each reform we record the effective date, a short description of the reform, whether or not the jury is allowed to know about the reform, whether the reform was upheld or struck down by the states’ courts, as well as whether it was amended by the state legislator."