Crowdsourcing legal research

A terrific discussion is underway at SLAW, prompted by news of a new Canadian online research service, about the future of commercial legal databases. Ever since the LII system (Legal Information Institute) got rolling, the writing has been on the wall for fee-based online caselaw databases — how much longer can you charge a price for what a competitor is giving away free?

The answer lies in value-add, which is where I think the really interesting developments will emerge. What will be the killer app for online legal research? At SLAW, Wendy suggests commentary and analysis, Laurel recommends a winnowing function, and Simon C suggests citation frequency tracking — all excellent ideas that an enterprising database provider should move on right now.

My contribution is the idea of a Digg-like function that would allow those viewing a case to determine how helpful it had been to previous readers in a given subject area. It would harness the wisdom of crowds to help determine what is and isn’t an important case. It could adopt the simple Digg click approach, or the slightly more detailed Amazon “Was this review helpful to you?” five-star format, to let users signal whether a given case is worth future researchers’ time. It’s not that far off from the old library rule that a well-worn book with marked pages and wrinkled binding shows its heavy use and utility to those who have come before.
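The mechanics of such a rating feature are simple enough. As a purely hypothetical sketch (none of these names come from any real service), the core of it is just a tally of "helpful" votes per case within a subject area, like Amazon's "Was this review helpful to you?" widget:

```python
from collections import defaultdict

class CaseRatings:
    """Hypothetical sketch of a Digg-style helpfulness tally for cases.

    Each (citation, subject area) pair accumulates helpful / not-helpful
    votes; cases can then be ranked within a subject area by their ratio.
    """

    def __init__(self):
        # (citation, subject) -> [helpful_votes, total_votes]
        self._tallies = defaultdict(lambda: [0, 0])

    def vote(self, citation, subject, helpful):
        """Record one reader's verdict on a case's usefulness."""
        tally = self._tallies[(citation, subject)]
        if helpful:
            tally[0] += 1
        tally[1] += 1

    def helpfulness(self, citation, subject):
        """Fraction of voters who found the case helpful, or None if unrated."""
        helpful, total = self._tallies[(citation, subject)]
        return helpful / total if total else None

    def top_cases(self, subject, min_votes=1):
        """Rank rated cases in a subject area, best ratio first."""
        scored = [
            (cite, h / t)
            for (cite, subj), (h, t) in self._tallies.items()
            if subj == subject and t >= min_votes
        ]
        return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

A researcher reading Hunter v. Southam on a Charter point would click once; the aggregate score is what the next researcher sees. The real design questions (vote weighting, spam control, how subject areas are tagged) are exactly the ones the comments below take up.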

But what I find especially appealing about this idea is that it would help bring about the democratization of caselaw selection. During my time as editor of The Lawyers Weekly, I discovered something important about front-page news: it’s arbitrary. As a news consumer, I had accepted the unspoken presumption that what a newspaper placed on its front page, above the fold, was the most important news of the day. Then I was put in charge of choosing what would run above-the-fold-on-front. I chose front-page stories, and cases to be reported on, for a variety of reasons, and precedential significance was only one of them. Take a look at your local paper for confirmation that what’s on top of page one isn’t necessarily what you’d agree is the top story. Ditto for what leads off the newscast, local or CNN.

The same goes for the printed law reports that all of us (save the newest arrivals to the profession) grew up with. Who decides what gets reported and what doesn’t? One person, or a small handful of people, who may or may not have viewpoints, interests or biases that affect their choices. With every case now online, and tagging systems increasingly sophisticated, there’s no reason to keep assigning the editorial function to an elite few. The crowdsourced approach to online caselaw rating allows the entire legal community to weigh in on whether a given decision is important, and why. Given the choice between the expert and the crowd, I’d like to hear from the crowd.

It’s the natural next step towards an overall collaborative approach to legal research. Thanks to JD Supra, we can already see what a collaborative precedent and document database looks like. What will come next? Collective annotation of key statutes through a wiki? A multiplicity of online law reviews like The Court? More law school case summary services like Twistlaw? The discussion about the future of legal research won’t center around the commercial providers much longer. It will center around which free, collaborative sites create the best ways for lawyers and legal professionals to collectively improve everyone’s ability to find the legal information they need.


  1. Steve Matthews

    Jordan, I like your idea, but wonder if this type of system could/would be ‘gamed’? And if so, to whose benefit?

    It would be easy enough to set up, but how much value would it deliver? Perhaps a closed community of the legal profession, rather than a free-for-all?

The other question I would have is whether it would be an improvement over caselaw citation. One would think the cases lawyers cite before the courts would be a pretty tough measure of authority to improve on. Do you think a ‘wisdom of the crowds’ concept would fare better? Or maybe it would simply offer an alternative and more democratic measure of caselaw value? I think I’d be more inclined to see it as an alternative measure concept. You?

  2. Jordan Furlong

    Steve, these are good questions. I don’t really see the system getting gamed, because the motivation to do so in this case would be close to zero. A lawyer rating system like Avvo is extremely gaming-prone, but there’s nobody out there with sufficient interest in illicitly boosting Hunter v. Southam over R. v. Oakes as the prime Charter ruling. I suppose there could be an orchestrated prank, a Rory-Fitzpatrick-like effort to vote heavily for some obscure NBQB ruling, but I would think that would require such unorthodox voting patterns that it would stand out and be rectified as an extreme outlier. For that reason, I would think a closed lawyer community wouldn’t be necessary — and maintaining such a community has its own challenges and costs in any event.

    I could accept the argument that citation in a published court decision is a more effective measure of a case’s utility only insofar as courts themselves are the eventual target market for much legal research, and that therefore, if you want to impress a judge, you’d better show her a case that impressed other judges. I’m open to that purely pragmatic approach. But I’m not yet prepared to accept that judges — or their clerks, who do a lot of the actual research — are inherently that much better at assessing the value of a given court ruling than is the legal profession at large. Judges have the same information biases, preferences and hangups as the rest of us, with the added difficulty that they’re choosing among decisions written by colleagues, friends, or appellate judges who’ll be reviewing their work. In a sufficiently large and diverse pool of caselaw Diggers (which I grant you is a prerequisite to making the system work), those biases tend to get washed out. Moreover, judicial citations can eventually become something of a closed shop — the same rulings keep showing up all the time in multiple decisions — and it’s harder for a new entry to make its way in. I’m willing to bet there’s a tremendous amount of insight to be found in judgments that populate the long tail of the caselaw population, but that don’t show up in the elite circulation of most cited cases.

    All that said, I would concede that using a Digg-type system might function best, at least to begin with, as an alternative, complementary system to straight caselaw citation by judges. Judges (and their clerks) do know their stuff — I just think that the profession at large knows its stuff too, and if given the chance through a system like the one envisioned, could show it.
