Crowdsourcing legal research

A terrific discussion is underway at SLAW, prompted by news of a new Canadian online research service, about the future of commercial legal databases. Ever since the LII system (Legal Information Institute) got rolling, the writing has been on the wall for fee-based online caselaw databases — how much longer can you charge a price for what a competitor is giving away free?

The answer lies in value-add, which is where I think the really interesting developments will emerge. What will be the killer app for online legal research? At SLAW, Wendy suggests commentary and analysis, Laurel recommends a winnowing function, and Simon C suggests citation frequency tracking — all excellent ideas that an enterprising database provider should move on right now.

My contribution is the idea of a Digg-like function that would allow those viewing a case to see how helpful it had been to previous readers in a given subject area. It would harness the wisdom of crowds to help determine what is and isn’t an important case. It could adopt the simple one-click Digg approach, the yes/no Amazon “Was this review helpful to you?” prompt, or a slightly more detailed five-star rating, to let users signal whether a given case is worth future researchers’ time. It’s not that far off from the old library rule that a well-worn book with marked pages and a wrinkled binding shows its heavy use and its utility to those who have come before.
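For the technically curious, the mechanics of such a tally are simple. Here’s a rough sketch in Python — all names are invented for illustration, and nothing here describes any actual service — of a per-subject “was this case helpful?” vote counter:

```python
from collections import defaultdict

class CaseHelpfulness:
    """Hypothetical tally of reader votes on whether a case was helpful,
    broken down by subject area (e.g. 'tort', 'contracts')."""

    def __init__(self):
        # (case citation, subject area) -> [helpful votes, total votes]
        self._votes = defaultdict(lambda: [0, 0])

    def vote(self, citation, subject, helpful):
        """Record one reader's yes/no verdict on a case in a subject area."""
        tally = self._votes[(citation, subject)]
        tally[1] += 1
        if helpful:
            tally[0] += 1

    def score(self, citation, subject):
        """Fraction of readers who found the case helpful, or None if unrated."""
        helpful, total = self._votes[(citation, subject)]
        return helpful / total if total else None
```

A research service could then sort search results within a subject area by this score, surfacing the cases the community has already found useful.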

But what I find especially appealing about this idea is that it would help bring about the democratization of caselaw selection. During my time as editor of The Lawyers Weekly, I discovered something important about front-page news: it’s arbitrary. As a news consumer, I had accepted the unspoken presumption that what a newspaper placed on its front page, above the fold, was the most important news of the day. Then I was put in charge of choosing what would run above the fold on the front page. I chose front-page stories, and cases to be reported on, for a variety of reasons, and precedential significance was only one of them. Take a look at your local paper for confirmation that what’s on top of page one isn’t necessarily what you’d agree is the top story. Ditto for what leads off the newscast, local or CNN.

The same goes for the printed law reports that all of us (save the newest arrivals to the profession) grew up with. Who decides what gets reported and what doesn’t? One person, or a small handful of people, who may or may not have viewpoints, interests or biases that affect their choices. With every case now online, and tagging systems increasingly sophisticated, there’s no reason to keep assigning the editorial function to an elite few. The crowdsourced approach to online caselaw rating allows the entire legal community to weigh in on whether a given decision is important, and why. Given the choice between the expert and the crowd, I’d like to hear from the crowd.

It’s the natural next step towards an overall collaborative approach to legal research. Thanks to JD Supra, we can already see what a collaborative precedent and document database looks like. What will come next? Collective annotation of key statutes through a wiki? A multiplicity of online law reviews like The Court? More law school case summary services like Twistlaw? The discussion about the future of legal research won’t center on the commercial providers much longer. It will center on which free, collaborative sites create the best ways for lawyers and legal professionals to collectively improve everyone’s ability to find the legal information they need.
