Since the inception of the Law School Survey of Student Engagement (LSSSE) in 2003, Indiana Law has been a regular participant. For readers unfamiliar with LSSSE, it is the law school version of the National Survey of Student Engagement, which has been widely adopted by undergraduate institutions. Each spring, LSSSE sends a questionnaire to students at participating law schools to elicit information on their law school experiences, including:
- Classroom environment and interactions with faculty (20 variables)
- Self-reported gains on important constructs related to effective lawyering (5 variables)
- Type and volume of writing within law school (3 variables)
- Participation in extracurricular or co-curricular activities (9 variables)
- Satisfaction with law school experience (7 variables)
- Time allocation during law school (e.g., studying for class, socializing, exercising, volunteering) (12 variables)
- Collegial and supportive atmosphere (3 variables)
- Self-reported gains on various occupational and interpersonal skills (16 variables)
- Total satisfaction with law school and willingness to attend the same law school again (2 variables)
- Demographic attributes, debt loads, law school grades, entering credentials, career goals, etc. (note that it is possible to match in additional institutional information).
Last year, we did a trend analysis of our LSSSE data and found that we were making progress (or occasionally backsliding) on several key dimensions. But the trend analysis raised one crucial question: What was our baseline? For example, a slight decline in our mean score may not be cause for alarm if we are still outperforming peer schools on some dimension. Alternatively, an increase on another key dimension may not be good enough if we are still well below average vis-a-vis other law schools.
Fortunately, LSSSE supplies schools with this benchmarking data, including t-tests to flag statistically significant differences. In our case, there are four reference groups: the LSSSE sample as a whole (79 schools for 2007), other public law schools, other law schools with enrollments of 500 to 900 students, and selected peers--in our case, six other schools in USNWR Tier 1 (we know which schools, but their scores are aggregated).
Better still, this benchmarking data can be placed in a visual format that quickly conveys our law school's relative performance. The slide below is from a lunchtime forum presented to our faculty last week. The methodology is very simple: subtract the LSSSE mean from the mean of each of the four reference groups. In general, positive values ("above the line") show that the group is outperforming the LSSSE sample; negative values ("below the line") show underperformance. An asterisk (*) denotes a statistically significant difference between IU and the reference group.
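For readers who want to build the same kind of chart from their own school's report, here is a minimal sketch in Python. The group labels, difference values, and significance flags are hypothetical stand-ins for the numbers LSSSE supplies; the plotting assumes matplotlib is available:

```python
import matplotlib.pyplot as plt

# Hypothetical numbers standing in for one LSSSE construct: each group's
# mean minus the overall LSSSE mean, plus made-up t-test flags for whether
# our school differs significantly from that reference group.
groups = ["Our school", "Other publics", "500-900 enrollment", "Selected peers"]
diff_from_lssse = [0.21, 0.05, -0.08, 0.12]   # example values only
sig_vs_us = [False, True, False, True]         # flags from the LSSSE report

fig, ax = plt.subplots()
bars = ax.bar(groups, diff_from_lssse, color="steelblue")
ax.axhline(0, color="black", linewidth=0.8)    # "the line": the LSSSE mean
for bar, sig in zip(bars, sig_vs_us):
    if sig:  # asterisk marks a significant difference vis-a-vis our school
        y = max(bar.get_height(), 0)
        ax.annotate("*", (bar.get_x() + bar.get_width() / 2, y),
                    ha="center", va="bottom", fontsize=14)
ax.set_ylabel("Difference from LSSSE mean")
plt.show()
```

Bars above the zero line then read as outperforming the LSSSE sample, exactly as in the faculty slide.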
Note that these charts were generated from actual Indiana data--albeit the construct labels are stripped off because our performance vis-à-vis other law schools is proprietary. But I can say this: at Indiana, the glass is about 2/3 full. We truly excel on some key dimensions, and we can be proud of that fact. On other dimensions that should be hallmarks of a truly great law school, we are middle-of-the-pack. And on a handful, we have negative measures that warrant careful, targeted follow-up. Yet our self-assessment is not merely a matter of opinion and anecdote--it is rooted in actual data, with benchmarks and trend lines drawn from over 90 variables measured annually for five consecutive years.
After the jump, I explain why law professors and deans should care about LSSSE benchmarking--or, more accurately, the perils of dismissing it in favor of pet ideas that lack similar empirical support.
Over the years, I have learned a lot--or I think I have learned a lot--about presenting empirical data to legal audiences. Law professors are the most skeptical and argumentative lot imaginable. When empirical data is presented that challenges a well-entrenched view, law professors query whether the sample or methodology can really be trusted--lack of statistical knowledge is rarely an impediment to this line of objection. This is a great mindset if we are engaged in a winner-take-all adversarial contest. But it is not the right approach for building a great institution.
So when LSSSE benchmarks undermine a cherished self-perception about one's law school, should we defer to the data? Or should we assume that our students did not understand the question, or that there was rampant selection bias? Here are some relevant facts:
- 4 out of 5 law schools participating in LSSSE have response rates greater than 50%;
- Demographically, the samples are virtually always representative of the school as a whole;
- 1Ls are more likely to respond than 2Ls or 3Ls (at Indiana, 70% for 1Ls, 55% for 2Ls and 3Ls); fortunately, law school year is a variable in the dataset, so results can be disaggregated or reweighted by class year (see the sketch after this list).
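Because class year is recorded for every respondent, a school can keep the 1L-heavy response pattern from skewing a school-wide mean. Here is a minimal sketch in Python; the column names, scores, and enrollment shares are all hypothetical:

```python
import pandas as pd

# Hypothetical respondent-level data; column names are illustrative.
df = pd.DataFrame({
    "class_year": ["1L", "1L", "2L", "3L", "2L", "3L"],
    "satisfaction": [3.8, 3.5, 3.2, 3.0, 3.4, 3.1],
})

# Made-up enrollment shares for the whole student body, used to
# counteract the higher 1L response rate.
enrollment_share = {"1L": 0.34, "2L": 0.33, "3L": 0.33}

# Mean by class year, then a mean reweighted to enrollment shares.
by_year = df.groupby("class_year")["satisfaction"].mean()
weighted_mean = sum(by_year[y] * w for y, w in enrollment_share.items())
print(by_year)
print(f"Enrollment-weighted mean: {weighted_mean:.2f}")
```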
If a law school were a typical for-profit business, a CEO with standard MBA training would be ecstatic if over 50 percent of her customers--all college-educated with relatively high analytical aptitudes--had taken 15 minutes of their time to evaluate their experience with the school/company during the last year. Even more potent is the availability of aggregate-level data for key industry rivals. This information enables the school/company to identify ways to be better/faster/stronger than rivals with comparable inputs. In terms of business management, this is all just Six Sigma or TQM 101 -- i.e., an iterative data- and theory-driven focus on quality outputs.
Law faculty who would dismiss such detailed market intelligence in favor of their own vision of a great law school, typically without any empirical data to assess actual progress, are choosing a path fraught with problems. As the price of legal education rises faster than the earning power of most law school graduates, law school applicants are declining. Further, we can expect those students who do apply to be more discriminating consumers.
So why am I writing this blog post? The answer is simple. Legal education is not about turning a profit or maximizing prestige--to my mind, it is about educating highly competent, ethical lawyers who carry forward the highest ideals of the profession. So, as a moral imperative, this information needs to be shared (and my dean, Lauren Robel, agrees with this assessment).
Further, to reinforce the moral point, there are compelling institutional reasons for focusing on LSSSE data. Five years' worth of data is a great window of experience. The only other law school with five years' worth of LSSSE data is New York Law School. I encourage other law professors and administrators to read Dean Richard Matasar's preface to the 2007 LSSSE Annual Report, which discusses how NYLS used LSSSE data, in conjunction with innovative curricular changes, to dramatically improve the school's bar passage rate (this year, 1 point below Cornell)! A law school that has not yet participated in LSSSE is already years away from duplicating NYLS's successes.
Here at Indiana Law, we have a plan to systematically use LSSSE data to leverage our strengths and to focus on and correct our weaknesses. In the years to come, we want our successes to snowball, thus establishing our distinctive brand through a relentless focus on substance rather than marketing hype. To give you a taste of what these data look like, here are some graphics drawn from Indiana data and posted with my dean's permission.
1. The chart below shows that Indiana Law excels in its administrative and support services.
These strong scores are corroborated by comparable scores for our law school technology, library service, student counseling, academic counseling, and financial aid.
2. Career services is the Achilles' heel of virtually all non-elite law schools. Here at Indiana, this office has been a target for a lot of restructuring and additional resources during the last several years. And as the chart below illustrates, our trend lines are moving in the right direction:
3. In terms of classroom dynamics, we are pleased that our students, relative to students at other law schools, tend to come to class prepared. This dynamic is partially a function of being a Tier 1 public law school--our students move to Bloomington and don't have part-time jobs. (Indeed, the LSSSE data shows very low commute times for our students.) But this higher modal level of preparation improves the classroom atmosphere for everyone.
I am happy to talk more about LSSSE with faculty at other law schools. (Disclosure: I am a research associate with LSSSE, though I get no financial or research support through this affiliation.) If you want information on participating in the LSSSE survey, contact the LSSSE Project Manager, Lindsay Watkins.
Ted,
I concede your first point, but don't understand your second one.
1) When I wrote "proprietary" I was thinking about the contractual provision between LSSSE and the law school that LSSSE will not reveal a school's score without the school's consent. But I suspect that LSSSE would be subject to a Freedom of Information Act (FOIA) request and/or the state-law analogue. If someone systematically collects public law schools' LSSSE data and publishes it, schools will have less incentive to participate. One solution is to make LSSSE mandatory and transparent as a condition of accreditation. I would be all for that.
2) What empirical work have I "published" in relation to LSSSE? The point of the post is to encourage schools to use LSSSE data to analyze their schools internally, not to showcase Indiana's LSSSE scores -- indeed, I point out that they are low on some dimensions. The aggregate-level data is calculated by LSSSE, including the t-test results. I generated the differences (displayed on the bar charts) using an Excel spreadsheet, with my assistant plugging in the data. But again, that is just to generate visual graphs. Getting the data that generated the t-test results is impossible (as a practical matter) because LSSSE only has the right, under a contract with 80+ institutions, to release aggregated results.
Re your point on "making the underlying database publicly available": I think you are painting with too broad a brush. I know lots of PhDs who are willing to share their data ... after they are done with it. There needs to be some incentive to create original data, which comes in the form of the right to exclude. Further, some of the best data available these days is (really) proprietary because it was created by private industry; but for a licensing agreement, no one would have access to it.
In one of the licensing agreements I work under, there is a provision that permits making data available to journal editors to re-run regressions (e.g., using Stata ado files). The burden is on us to negotiate limited use with journal editors. But sharing the dataset with a third party is forbidden. Another possibility is posting regression results on the Internet, so everyone can look at multiple specifications.
The flipside is externally funded datasets: if someone gets a grant from the NSF, then making the data publicly available is a condition of the funding. The quid pro quo is that others don't get access until you publish the first paper and you are compensated for the time and effort to create the resource.
Licensing issues and incentive structures make it hard, in my opinion, to support a bright-line rule for posting data. bh.
Posted by: Bill Henderson | 06 February 2008 at 01:49 PM
I am intrigued by the premise that the performance of a state-subsidized school is proprietary information.
I am also intrigued by the premise that it is credible to publish empirical work without making the underlying database publicly available. This is not so in any other field. The possibility of replication is essential to credibility.
Posted by: Theodore Seto | 06 February 2008 at 11:18 AM
"When empirical data is presented that challenges a well entrenched view, law professors query whether the sample or methodology can really be trusted--lack of statistical knowledge is rarely an impediment to this line of objection. This is a great mindset if we are engaged in a winner-take-all adversarial contest. But it not the right approach for building a great institution."
Nor, I would add, for intellectual inquiry, at least not that of an empirical sort. But since I've blogged on this before (at http://tinyurl.com/39xy7v ) I'll get off my soapbox.
Posted by: Christopher Zorn | 25 January 2008 at 09:39 AM