Since the inception of the Law School Survey of Student Engagement (LSSSE) in 2003, Indiana Law has been a regular participant. For readers unfamiliar with LSSSE, it is the law school version of the National Survey of Student Engagement, which has been widely adopted by undergraduate institutions. In the spring of each year, LSSSE sends a questionnaire to students at participating law schools to elicit information on their law school experiences, including:
- Classroom environment and interactions with faculty (20 variables)
- Self-reported gains on important constructs related to effective lawyering (5 variables)
- Type and volume of writing within law school (3 variables)
- Participation in extracurricular or co-curricular activities (9 variables)
- Satisfaction with law school experience (7 variables)
- Time allocation during law school (e.g., studying for class, socializing, exercising, volunteering) (12 variables)
- Collegial and supportive atmosphere (3 variables)
- Self-reported gains on various occupational and interpersonal skills (16 variables)
- Total satisfaction with law school and willingness to attend the same law school again (2 variables)
- Demographic attributes, debt loads, law school grades, entering credentials and career goals, etc. (note that it is possible to match and add in additional information).
Last year, we did a trend analysis of our LSSSE data and found that we were making progress (or occasionally backsliding) on several key dimensions. But the trend analysis raised one crucial question: What was our baseline? For example, a slight decline in our mean score may not be cause for alarm if we are still outperforming peer schools on some dimension. Alternatively, an increase on another key dimension may not be good enough if we are still well below average vis-a-vis other law schools.
Fortunately, LSSSE supplies schools with this benchmarking data, including t-tests to flag statistically significant differences. In our case, there are four reference groups: the LSSSE sample as a whole (79 schools for 2007), other public law schools, other law schools with enrollments of 500 to 900 students, and selected peers--in our case, six other schools in USNWR Tier 1 (we know which schools, but their scores are aggregated).
Fortunately, this benchmarking data can be placed in a visual format that quickly conveys our law school's relative performance. The slide below is from a lunchtime forum presented to our faculty last week. The methodology is very simple: subtract the LSSSE mean from the mean of each of the four reference groups. In general, positive values ("above the line") show that the group is outperforming the LSSSE sample; negative values ("below the line") show underperformance. An asterisk (*) denotes a statistically significant difference between IU and the reference group.
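For readers who want to try the subtract-and-flag methodology on their own numbers, it can be sketched in a few lines of Python. Everything here is illustrative: the function name and the scores are hypothetical, and a real analysis would rely on LSSSE's reported means and its supplied t-tests rather than this simplified Welch-style approximation.

```python
from statistics import mean, variance

def benchmark(school_scores, reference_scores, crit=1.96):
    """Difference of means, flagged '*' when a Welch-style t-test
    suggests the gap is statistically significant (alpha ~ .05)."""
    diff = mean(school_scores) - mean(reference_scores)
    # Standard error of the difference, allowing unequal variances.
    se = (variance(school_scores) / len(school_scores)
          + variance(reference_scores) / len(reference_scores)) ** 0.5
    t = diff / se if se else 0.0
    flag = "*" if abs(t) > crit else ""
    return round(diff, 2), flag

# Hypothetical construct scores: our school vs. one reference group.
ours = [3.4, 3.6, 3.5, 3.8, 3.7, 3.5, 3.6, 3.9]
ref  = [3.0, 3.1, 2.9, 3.2, 3.0, 3.1, 3.0, 3.2]
diff, flag = benchmark(ours, ref)
print(f"{diff:+.2f}{flag}")  # positive values are "above the line"
```

Run for each construct against each reference group, the resulting signed differences are the "above the line" and "below the line" values plotted in the slide.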
Note that these charts were generated from actual Indiana data--although the construct labels are stripped off because our performance vis-a-vis other law schools is proprietary. But I can say this: at Indiana, the glass is about 2/3 full. We truly excel on some key dimensions, and we can be proud of that fact. On other dimensions that should be hallmarks of a truly great law school, we are middle-of-the-pack. And on a handful, we have negative measures that warrant careful, targeted follow-up. Yet our self-assessment is not merely a matter of opinion and anecdote--it is rooted in actual data, with benchmarks and trend lines drawn from over 90 variables measured annually for five consecutive years.
After the jump, I explain why law professors and deans should care about LSSSE benchmarking--or, more accurately, the perils of dismissing it in favor of pet ideas that lack similar empirical support.
Over the years, I have learned a lot--or I think I have learned a lot--about presenting empirical data to legal audiences. Law professors are the most skeptical and argumentative lot imaginable. When empirical data is presented that challenges a well-entrenched view, law professors query whether the sample or methodology can really be trusted--a lack of statistical knowledge is rarely an impediment to this line of objection. This is a great mindset if we are engaged in a winner-take-all adversarial contest. But it is not the right approach for building a great institution.
So when LSSSE benchmarks undermine a cherished self-perception about one's law school, should we defer to the data? Or should we assume that our students did not understand the question, or that there was rampant selection bias? Here are some relevant facts:
- 4 out of 5 law schools participating in LSSSE have response rates greater than 50%;
- Demographically, the samples are virtually always representative of the school as a whole;
- 1Ls are more likely to respond than 2Ls or 3Ls (at Indiana, 70% for 1Ls, 55% for 2Ls and 3Ls); fortunately, law school year is a variable in the dataset.
If a law school were a typical for-profit business, a CEO with standard MBA training would be ecstatic if over 50 percent of her customers--all college-educated, with relatively high analytical aptitudes--had taken 15 minutes of their time to evaluate their experience with the school/company during the last year. Even more potent is the availability of aggregate-level data for key industry rivals. This information enables the school/company to identify ways to be better/faster/stronger than rivals with comparable inputs. In terms of business management, this is all just Six Sigma, or TQM 101 -- i.e., an iterative, data- and theory-driven focus on quality outputs.
Law faculty who would dismiss such detailed market intelligence in favor of their own vision of a great law school--typically a vision unaccompanied by any empirical data to assess actual progress--are choosing a path fraught with problems. As the price of legal education rises faster than the earning power of most law school graduates, law school applicants are declining. Further, we can expect those students who do apply to be more discriminating consumers.
So why am I writing this blog post? The answer is simple. Legal education is not about turning a profit or maximizing prestige--to my mind, it is about educating highly competent, ethical lawyers who carry forward the highest ideals of the profession. So, as a moral imperative, this information needs to be shared (and my dean, Lauren Robel, agrees with this assessment).
Further, to reinforce the moral point, there are compelling institutional reasons for focusing on LSSSE data. Five years' worth of data is a great window of experience. The only other law school with five years' worth of LSSSE data is New York Law School. I encourage other law professors and administrators to read Dean Richard Matasar's preface to the 2007 LSSSE Annual Report, which discusses how NYLS used LSSSE data, in conjunction with innovative curricular changes, to dramatically improve the school's bar passage rate (this year, 1 point below Cornell)! If another law school has not yet participated in LSSSE, it is already years away from duplicating NYLS's successes.
Here at Indiana Law, we have a plan to systematically use LSSSE data to leverage our strengths and to focus on and correct our weaknesses. In the years to come, we want our successes to snowball, thus establishing our distinctive brand through a relentless focus on substance rather than marketing hype. To give you a taste of what this data looks like, here are some graphics drawn from Indiana data and posted with my dean's permission.
These strong scores are corroborated by comparable scores for our law school technology, library service, student counseling, academic counseling, and financial aid.
2. Career services is the Achilles' heel of virtually all non-elite law schools. Here at Indiana, this office has been the target of substantial restructuring and additional resources during the last several years. And as the chart below illustrates, our trend lines are moving in the right direction:
3. In terms of classroom dynamics, we are pleased that our students, relative to those at other law schools, tend to come to class prepared. This dynamic is partially a function of being a Tier 1 public law school--our students move to Bloomington and don't have part-time jobs. (Indeed, the LSSSE data shows very low commute times for our students.) But this higher modal level of preparation improves the classroom atmosphere for everyone.
I am happy to talk more about LSSSE with faculty at other law schools. (Disclosure: I am a research associate with LSSSE, though I get no financial or research support through this affiliation.) If you want information on participating in the LSSSE survey, contact the LSSSE Project Manager, Lindsay Watkins.