Dispelling the myths of lawyer education

There’s an old story about a supposed experiment in which five apes are placed in a cage containing a stepladder. A banana is hanging from the roof of the cage, and a sprinkler with ice-cold water is positioned above it. Whenever an ape tries to climb the ladder to get the banana, the sprinkler comes on and drenches all the apes until the ambitious ape abandons the effort. Eventually, after numerous attempts and soakings, the apes learn to avoid the ladder altogether. Then the sprinkler is turned off completely.

Now one of the apes is replaced with a new ape, who, not surprisingly, heads straight for the stepladder to get the banana. The other apes set upon him immediately, beating and shoving him until he gives up — even though the water never comes on. Then another replacement ape arrives, and when he tries to get the banana, the other apes attack him — including the previous new ape who has never been soaked! Eventually, five new apes who’ve never been showered with ice water will nonetheless avoid the stepladder and the banana. And that, the story goes, is where policy comes from — that’s the way we’ve always done it around here.

The legal profession resembles that cage in a lot of ways, but how we educate and recruit new lawyers might be the best example. Our beliefs and practices about the legal training process owe far more to our professional myths and oral traditions than they do to the cold light of evidence. Here are two recent examples.

Leah Christensen of the Thomas Jefferson School of Law has just published a paper titled “Predicting Law School Success: A Study of Goal Orientations, Academic Achievement, and the Declining Self-Efficacy of Our Law Students.” (HT to Douglas Berman at the Law School Innovation blog.) The paper focuses largely on goal orientation and student self-efficacy, two concepts I’m not sure I can fully grasp on just one cup of coffee, but the key finding for me was this:

[T]he LSAT score was the weakest predictor of law school success. … The strongest predictor of success was between Lawyering Skills grade and class rank (0.57). There was a moderate positive correlation between UGPA and class rank (0.46). And, there was a weak correlation between LSAT score and class rank (0.23). According to the results of this study, Lawyering Skills Grade is a better predictor of class rank than the LSAT. Further, undergraduate performance, rather than a student’s performance on the LSAT, is a stronger predictor of law school performance.

As Christensen points out, this is just the latest data point to support the increasingly obvious fact that the LSAT’s importance in the law school admission process is wildly disproportionate to its usefulness. Yet law schools continue to over-rely on it, and the infamous US News & World Report law school rankings continue to over-emphasize it. Incoming students’ shoe size is probably about as reliable a predictor of law school success as the LSAT — yet when a school considers even tweaking its approach to the LSAT, all hell breaks loose.

On the heels of Christensen’s paper comes another from Benjamin Barton of the University of Tennessee College of Law at Knoxville, which asks: “Is There a Correlation Between Law Professor Publication Counts, Law Review Citation Counts, and Teaching Evaluations?” (HT this time to the TaxProf Blog, via Best Practices for Legal Education.) Barton’s comprehensive study calls into serious question law schools’ penchant for hiring professors whose skills lie in publishing rather than teaching:

[It] covers every tenured or tenure-track faculty member at 19 U.S. law schools, a total of 623 professors. The study gathers four years of teaching evaluation data (calendar years 2000–2003) and tests for associations between the teaching data and five different measures of research productivity/scholarly influence.

The results are counterintuitive: there is either no correlation or a slight positive correlation between teaching effectiveness and any of the five measures of research productivity. Given the breadth of the study, this finding is quite robust. These findings should help inform debates about teaching and scholarship among law school and other faculties and likely require some soul-searching about the interaction between the two most important functions of U.S. law schools.

Soul-searching might be called for, but are we likely to see it happen? Law school culture has long valued profs’ academic reputation and publication credentials well beyond their worth to law students. With a few honourable exceptions, schools have shown little interest in changing that.

But let’s not imagine that legal academia has cornered the market on new-lawyer myths. Law firms continue to over-value new graduates’ law school transcripts when selecting and hiring new recruits. As I noted earlier this year, when one law firm sat down to determine the correlation between its new lawyers’ academic achievement and their success as working lawyers, it found there wasn’t one:

The firm compared each of its associates’ grades, class rank, and school rank to their evaluations and accomplishments at the firm. Blackwell found that neither law school rank nor class rank could determine who would become a standout lawyer.

That firm, now Husch Blackwell Sanders, went on to distill what it learned from this and similar studies into a handbook titled From Classes to Competencies, Lockstep To Levels, which documents the transformation of its entire associate culture in light of reasoned empirical research.

So here’s what we have: evidence, often compelling, that:

  1. LSAT scores don’t tell you much about whether someone will be a good law student,
  2. Publishing credentials don’t tell you anything about whether someone will be a good law professor, and
  3. Law school marks don’t tell you anything about whether someone will be a good lawyer.

And yet LSAT scores, law professor credentials and law school marks remain the three most significant criteria employed within the lawyer training system. Apes in a cage.

Your law school and your law firm don’t need to hunker down by the fire and retell the same old legends about how to train and recruit lawyers. The weight of longstanding and widespread practice shouldn’t be enough to keep you from revisiting your assumptions and getting it right — to your competitive advantage. Or, as one of my favourite Despair.com posters puts it: “Just because you’ve always done it that way doesn’t mean it’s not incredibly stupid.”



3 Comments

  1. Anne

    This has to be one of the more depressing articles I’ve read recently — not yours, Jordan, but Barton’s. It’s truly disheartening to discover that the connection between research and teaching appears to be totally chimerical. And worse, the findings by Barton (and Marsh and Hattie) support both academic administration’s contention that we’re teaching about as well as we ever will, so that scholarship in teaching is a big fat waste of time, and academic teachers’ contention that if they’d only give us a bit of a break from teaching we’d be much more productive researchers, regardless of how well (or ill) we’re teaching.

    The prospect that teaching and research really are “same planet, different worlds” offers a challenge that goes well beyond law schools; every institution of higher learning should confront these findings, follow them up, and decide what they mean for the future of university education. The temptation will be to continue the balkanisation of the university — let the teachers teach, the researchers research (and bring in the big money), and the administrators confirm the absolute division between them (with the rewards and perks going to the researchers, of course, while the teachers slog away in the trenches).

    But a finding of negative correlation between “time spent on teaching” and “improvement in teaching evaluations” also suggests that those of us who value teaching can relax, at least a little, and let our teaching personalities find their own level, turn our attention away from the in loco parentis tendency, and focus on what makes our working lives meaningful … and bearable. Still, knowing how these findings could be used, and feeling perhaps that all the time I spend on my students may have minimal impact, is a bitter pill to swallow halfway through the term!

  2. Doug Jasinski

    Very topical post – I came across another article on the ABA Journal website this morning about law school rank and GPA not being very good indicators of success at biglaw. Instead, the study identified 12 other factors that are better predictors of success. They play a bit of cat-and-mouse with what those 12 factors are, but one is participation in group hobbies and collegiate-level athletics: http://www.abajournal.com/weekly/school_rank_and_gpa_arent_the_best_predictors_of_biglaw_success
    It’s interesting fodder for anyone involved in lawyer and law student recruitment.

  3. Carolyn Elefant

    Jordan,
    While I agree 100 percent that law schools and firms misplace priorities, the truth is that many law firms do place much more emphasis on class rank than grades. That is why firms will often hire the top 4-5 people from even the so-called 2nd and 3rd tier law schools. And on a local level, firms will give preference to the top 5 percent of local grads over middle-of-the-road grads at top schools. My point is that even this study suggests that class rank is an indicator of performance and thus, in some respects, it does support a conclusion opposite to what some have gleaned from it, i.e., that grades don’t matter.
    Second, though this is anecdotal, I should also add that when I graduated college, one of the top 10 law schools definitely favored grades over LSAT scores. Two of my college classmates had rather mediocre LSAT scores, well below the reported average for that school. However, both were summa grads, ranked #1 in their respective departments, and both were accepted (and went on to successful, albeit conventional, legal careers). So I think that there are some law schools that do recognize that LSATs are pretty worthless.

