What Law Schools Should Tell Applicants
In a previous post, I criticized the approach the ABA has proposed for including bar passage rates in the accreditation process. This post considers the same question from the standpoint not of accreditation but of the information law schools ought to provide to applicants.
Most law students study law in order to practice it, and for most of them practice requires passing the bar. Naturally enough, they want to know how likely they are to pass depending on which school they attend. There is currently no way they can get that information.
To see the problem, imagine two law schools: Harford and Podunk. Harford, having its choice of students, admits ninety with LSAT scores of 180, the highest possible, and ten with LSAT scores of 160—a mix of affirmative action admissions, children of generous donors, and students admitted due to a bug in the admissions office software. Podunk admits ninety students with LSAT scores of 160, ten with scores of 180—students rejected by Harford due to the same bug.
Both classes graduate and take the bar. Harford reports a bar passage rate of 80%, Podunk of 50%. Which is the better school to go to if you want to pass the bar?
Of the eighty Harford students who passed the bar, seventy-eight arrived with an LSAT score of 180 and two with 160. Of the fifty Podunk students who passed, ten were students with 180 LSATs and forty were students with 160. The bar passage rate for the low LSAT group was 20% at Harford, 44% at Podunk. For the high LSAT group it was 87% at Harford, 100% at Podunk. For both groups, Podunk did better.
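The arithmetic can be checked directly. Here is a short sketch, using the hypothetical Harford/Podunk numbers above, that computes each school's overall passage rate and its rate within each LSAT group:

```python
# Bar passage by school and LSAT group, using the hypothetical
# Harford/Podunk numbers from the text.

# (school, LSAT score) -> (students admitted, students who passed)
cohorts = {
    ("Harford", 180): (90, 78),
    ("Harford", 160): (10, 2),
    ("Podunk", 180): (10, 10),
    ("Podunk", 160): (90, 40),
}

for school in ("Harford", "Podunk"):
    admitted = sum(n for (s, _), (n, _) in cohorts.items() if s == school)
    passed = sum(p for (s, _), (_, p) in cohorts.items() if s == school)
    print(f"{school} overall: {passed / admitted:.0%}")
    for score in (180, 160):
        n, p = cohorts[(school, score)]
        print(f"  LSAT {score}: {p / n:.0%}")
```

Running it shows Harford ahead overall (80% to 50%) yet behind within both the 180 group (87% to 100%) and the 160 group (20% to 44%), a textbook instance of Simpson's paradox.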
Bar passage depends both on the school and on the student; the average bar passage rate for the school, which is what gets published, shows the combined effect. So a school that admits better students may get a higher bar passage rate even if it does a worse job of teaching them.
To provide applicants the information they want, schools need to publish bar passage rates as a function of LSAT. A simple way of doing so would be to break LSAT scores into groups—176-180, 171-175, ...—and report bar passage rates for each group, perhaps summed over a period of two or three years to provide enough data for a meaningful figure. It might turn out that the elite schools did a worse job for everyone. More plausibly, it might well turn out that the elite schools did a better job for students with high LSATs and a worse job for students with low LSATs—useful information for the latter in deciding where to go.
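The proposed reporting scheme is easy to sketch. The graduate records below are invented for illustration; the banding function simply maps each score into its five-point range (176-180, 171-175, and so on) and tallies passes against attempts:

```python
from collections import defaultdict

def lsat_band(score):
    """Map an LSAT score to its five-point band, e.g. 178 -> '176-180'."""
    high = score + (180 - score) % 5  # top of the band containing this score
    return f"{high - 4}-{high}"

# Hypothetical graduate records for one school: (LSAT score, passed the bar?)
graduates = [(180, True), (178, True), (177, False),
             (162, True), (160, False), (158, False)]

totals = defaultdict(lambda: [0, 0])  # band -> [passed, took]
for score, passed in graduates:
    band = lsat_band(score)
    totals[band][0] += passed
    totals[band][1] += 1

for band in sorted(totals, reverse=True):
    p, n = totals[band]
    print(f"{band}: {p}/{n} passed ({p / n:.0%})")
```

A school would publish the resulting table, ideally pooled over two or three graduating classes so that the sparser bands still contain enough students for the percentages to mean something.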
[I am not at this point concerned with whether students ought to judge schools by bar passage rates, only with how we can help them do it.]