Conference season is here, and by mid-October I will have spent 15 out of 30 days at, or travelling to and from, foreign conferences. That inevitably reduces the number of articles I can write.
It may be helpful, however, if I provide links to some of the resources which have come my way recently covering the use of technology-assisted review, the reasons offered for not adopting it, and signs of court acceptance of it. These things are obviously linked – if courts are focusing on the use of technology, then perhaps lawyers might take a look at it. In the interests of time (yours as well as mine) I give the links with little commentary from me.
[The picture, incidentally, is meant to convey simultaneously the flying I have been doing and the accelerated jump through the clouds to blue skies if you use technology properly]
Attorney algorithm aversion and infatuation from Maura Grossman
Maura Grossman is a contributor to a series of short talks produced by Georgetown Law CLE and Exterro, one set of which is here. Her talk, called Attorney algorithm aversion and infatuation, considers lawyer resistance to algorithms that have been shown to be effective and efficient, and over-reliance on algorithms that are unproven and dangerous.
There are two other talks in the same webcast, one from Ralph Losey on proportional document review and the art of cost estimation, and one from Jeffrey Klingporn on technology and trust.
Adam Kuhn of OpenText on prioritised review with AI
Adam Kuhn wrote on the OpenText blog an article with the title Prioritised review with AI endorsed by a federal court. Its key sentence so far as I am concerned is this:
Wouldn’t you want to know the most important facts of your case up front, instead of being surprised at the end of your review project?
This is a variant on a long-standing expression of mine running something like this:
Who would turn down the opportunity to see the apparently most important documents first?
The theme, you will deduce, is prioritisation of review. The article also covers the long-running debate about whether it is appropriate to cull a data set before applying machine learning algorithms to it, and about transparency between parties when technology of this kind is used.
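For readers who like to see the idea in concrete terms, here is a minimal sketch of what prioritisation amounts to in principle – train a simple classifier on a handful of documents a reviewer has already coded, score the rest, and put the likeliest-relevant documents at the front of the queue. It is an illustration of the concept only, not Adam Kuhn’s method or OpenText’s implementation, and every document, label and score in it is invented.

```python
# A minimal sketch of prioritised review in principle, not any vendor's
# implementation: a simple classifier trained on a handful of already-reviewed
# documents is used to put the likeliest-relevant documents at the front of
# the review queue. All documents and labels below are invented.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

seed_docs = [
    "email discussing the alleged breach of the supply contract",
    "note arranging lunch with the team",
    "travel booking confirmation for the annual conference",
]
seed_labels = [1, 0, 0]  # 1 = coded relevant by a reviewer, 0 = not relevant
unreviewed_docs = [
    "draft letter before action concerning the supply contract breach",
    "office party invitation",
    "further correspondence about contract performance",
]

# Train on the reviewed seed set, score the rest, and review highest score first.
vectoriser = TfidfVectorizer()
model = LogisticRegression(max_iter=1000).fit(vectoriser.fit_transform(seed_docs), seed_labels)
scores = model.predict_proba(vectoriser.transform(unreviewed_docs))[:, 1]

for i in np.argsort(scores)[::-1]:
    print(f"{scores[i]:.2f}  {unreviewed_docs[i]}")
```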
Kelly Atherton of NightOwl on how TAR works and why it matters
Kelly Atherton is Director of Analytics and Managed Review at NightOwl Discovery. I have interviewed her a couple of times and can understand why NightOwl’s clients appreciate her ability to describe complex processes succinctly.
In an article in the UK-based New Law Journal, Kelly Atherton discusses how TAR works and why it matters for legal professionals. Among other things, she describes the difference between the older and newer approaches to technology-assisted review (TAR 1.0 and TAR 2.0), and gives some useful “top tips” to improve efficiency, save time and reduce cost.
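For anyone new to the jargon, the toy sketch below shows the distinction as it is commonly described – TAR 1.0 codes a fixed seed set, trains once and ranks the remainder in a single pass, while TAR 2.0 (continuous active learning) folds each new reviewer decision straight back into the ranking. It is not Kelly Atherton’s wording or any product’s workflow; the score and human_review functions are crude stand-ins invented for the example.

```python
# A toy sketch of the TAR 1.0 / TAR 2.0 distinction as commonly described, not
# Kelly Atherton's wording or any product's workflow. The score() and
# human_review() functions are deliberately crude stand-ins for a real
# classifier and a real reviewer.

def score(doc, relevant_examples):
    """Toy relevance score: word overlap with documents already coded relevant."""
    words = set(doc.lower().split())
    relevant_words = {w for d in relevant_examples for w in d.lower().split()}
    return len(words & relevant_words) / (len(words) or 1)

def human_review(doc):
    """Stand-in for a lawyer's coding decision."""
    return "breach" in doc.lower()

collection = [
    "breach of contract email",
    "lunch invitation",
    "draft breach notice",
    "travel booking",
]

# TAR 1.0: code a fixed seed set, train once, rank the remainder, stop.
seed = [d for d in collection[:2] if human_review(d)]
ranked_once = sorted(collection[2:], key=lambda d: score(d, seed), reverse=True)

# TAR 2.0 (continuous active learning): keep giving the reviewer the current
# top-ranked document and fold each decision straight back into the ranking.
relevant_so_far = list(seed)
remaining = collection[2:]
while remaining:
    remaining.sort(key=lambda d: score(d, relevant_so_far), reverse=True)
    next_doc = remaining.pop(0)
    if human_review(next_doc):
        relevant_so_far.append(next_doc)

print("TAR 1.0 one-pass ranking:", ranked_once)
print("TAR 2.0 documents coded relevant:", relevant_so_far)
```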
Jonathan Maas – don’t shoot the AI puppy!
Jonathan Maas is well-known as an authoritative figure and commentator on the use of technology in disputes. In an article on the Artificial Lawyer called “Don’t shoot the AI puppy!” he gives us a readily-understood explanation as to why errors attributed to technology are more usually properly attributed to human error “probably confounded by good old ignorance”. The technology, he says, “…is a slave to its human masters. It is repetitively taught what to retrieve by those masters until it satisfies them that it can largely be left alone to do its job correctly.”
Jonathan Maas’s analogy of a puppy sent off to find red or blue balls is a helpful one. His conclusion is that it is not enough “to hit the predictive coding “Go” button and, in effect, [let] the puppy [run] all over the yard uncontrolled”.
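To stretch the analogy a little, keeping hold of the lead usually means checking the output against a randomly sampled, human-coded control set rather than taking the results on trust. The sketch below shows the arithmetic of a simple recall estimate of that kind; the figures and field names are invented for illustration and do not come from Jonathan Maas’s article.

```python
# Invented figures only: a minimal illustration of estimating recall against a
# randomly sampled, human-coded control set, one common way of checking the
# output rather than letting the process run unchecked. The field names and
# the 500-document sample are assumptions made up for this example.
import random

random.seed(1)

# Each control-set document carries a human relevance decision and a note of
# whether the predictive-coding process actually retrieved it for review.
control_sample = [
    {"human_relevant": random.random() < 0.2, "retrieved": random.random() < 0.85}
    for _ in range(500)
]

truly_relevant = [d for d in control_sample if d["human_relevant"]]
also_retrieved = [d for d in truly_relevant if d["retrieved"]]
recall = len(also_retrieved) / len(truly_relevant) if truly_relevant else 0.0
print(f"Estimated recall on the control sample: {recall:.0%}")
```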
It is perhaps worth adding that I know of only one case in England and Wales where the technology itself was to blame, at least in part, for a significant eDiscovery failure.
Greg Bufithis on the role of perception in adoption of technology-assisted review in eDiscovery
Greg Bufithis picked up the theme in an article called What’s really wrong with technology-assisted review in eDiscovery? Perception. He considers more widely (and deeply) the reasons why lawyers have been slow to adopt technology-assisted review. There is, he says, an expectation of “near instantaneous response times” and a kind of assumption that one can simply point technology at a large volume of data and expect it to do the rest. He quotes Jonathan Maas:
in the law we are not looking for tins of paint to buy, or hotel rooms to book. We’re reading incredibly nuanced documents written in highly technical language particular to a specific business, or incredibly complex information referred to in very relaxed, colloquial wording. The technology works. It just needs proper human input, monitoring.
This does not do full justice to the Greg Bufithis article which warrants closer reading – as I said in opening, my purpose here is simply to point you to things you may find interesting.
Summary
Yes, of course, the use of sophisticated analytics requires some skill, and the involvement of somebody who knows what they are doing (that is, someone who has been there before and thought about it). The conclusion, however, should not be “It’s all over my head, I will leave it for now” but rather to consider the alternatives.
There are some very large volumes of data out there (and “very large” is a term which is perhaps relative to the size of firm – 10,000 documents seems trivial to one firm but enormous to another).
You could just sit down and read them in date order or in whatever order they come to hand, employing (and therefore paying, and charging for) a lot of fairly expensive people. You could do what Jonathan Maas suspects was done in the United case and ‘hit the predictive coding “Go” button’ and hand the results in unchecked to the other side. You could, in extremis, back out of this area of work. You could sit and complain that technology is taking work out of the hands of honest lawyers.
None of this seems an adequate response from intelligent people who seek to apply their brain and hard-won expertise to problems which the clients need to have solved. You might instead acquire the skills yourself or employ someone who does, or engage the services of one of the many providers whose business is supplementing your legal skills with their technology skills.
Increasingly, the court expectation is that you will do one of these things.
Court expectations
Here are three links (again without comment) which suggest that the rules (as well as professional standards) expect lawyers to get to grips with this stuff.
The New York State Commercial Division (the state business trial court) now has a TAR-related rule reading as follows:
The parties are encouraged to use the most efficient means to review documents, including electronically stored information (“ESI”), that is consistent with the parties’ disclosure obligations under Article 31 of the CPLR and proportional to the needs of the case. Such means may include technology-assisted review, including predictive coding, in appropriate cases. The parties are encouraged to confer, at the outset of discovery and as needed throughout the discovery period, about technology-assisted review mechanisms they intend to use in document review and production.
There is an article on the NY Commercial Division Blog about this. Thanks to Maura Grossman for pointing me to the text.
The Supreme Court of Queensland
Practice Direction Number 18 of 2018 in the Supreme Court of Queensland is headed Efficient conduct of civil litigation. Among other things it provides for the use of technology “where possible to achieve efficiency”, the avoidance of unnecessary costs in relation to documents “which are not directly relevant to the real issues in dispute”, proportionality, conferring between the parties, focus on the real issues in dispute, and court intervention and supervision.
Much of the wording will be familiar to those who have seen parallel rules in other Australian jurisdictions, and in the UK, the US, Hong Kong and Ireland. The point is not that any of this is new but that the obligations to use (or at least consider and discuss) technology are becoming enshrined in rules around the world.
Thanks to Geoffrey Lambert for pointing me to this.
The new disclosure rule in England and Wales
This now exists as an approved draft and will go into its pilot in January 2019 (I wrote briefly about it here). You can get to the linked documents from this page on the Judiciary website. Have a look at page 16 of the Draft Disclosure Review Document and the requirements as to the use of analytics generally, and technology-assisted review more specifically.
The points made in the other articles above may suddenly seem more relevant.