[Update 30 April 2017: In an email exchange with Dan Rubins, CEO of Legal Robot, I learned that the New York Times article doesn’t reflect at all how Legal Robot handles shall. I’m OK with having written my post based on what was in the article, but the shall part was what prompted me to write the post in the first place. I considered deleting this post, but I’ve decided to leave it up, with this note. If a newspaper of record says something nonsensical about my topic, it’s appropriate for me to say something. Regarding Legal Robot, reliance on EDGAR remains a problem, as it is with other products, but otherwise I have nothing to say about Legal Robot, because I haven’t tried it.]
Thanks to agent provocateur @AlexHamiltonRad on Twitter, yesterday I learned of this article in the New York Times. It’s about how “some companies are … deploying artificial intelligence in the workplace and asking their employees to train the A.I. to become more human.” If you hadn’t noticed, “artificial intelligence” is the latest legal-tech buzzword. One of the five people featured in this article is Dan Rubins, CEO of Legal Robot. Here’s an extract:
Having reviewed nearly a million legal documents, Legal Robot also flags anomalies (strange wording or clauses) in contracts. “Lawyers have had 400 years to innovate and change the profession, and they haven’t done it,” said Mr. Rubins, who is a lawyer. “It’s time for some outside help.”
He said legal documents are well suited to machine learning because they are highly structured and repetitive. Legal Robot tapped a vast trove of contracts prepared by human lawyers in filings with the Securities and Exchange Commission—”a cesspool of legal language,” Mr. Rubins said—as well as past documents from law firms who wanted to help train Legal Robot’s systems.
Mr. Rubins, 33, said that the A.I. is good at identifying potentially vague word choices. He recently received a two-page nondisclosure agreement—it was reviewed by human lawyers—from another company containing the word “shall” 30 times. The A.I. pointed out that “shall” can be vague and advised that “will” or “may” are more clear, depending on the context.
Alex Hamilton predicted I’d “have kittens” on seeing this. Indeed.
If you want to train artificial intelligence, unleashing it on the SEC’s EDGAR system is deeply unpromising. If all you feed it is dysfunctional traditional language, artificial intelligence will never be able to tell you what clear and concise contract language looks like. For all I know, Legal Robot might regard as “strange wording” contract language that complies with A Manual of Style for Contract Drafting.
I’ve written previously about how legal tech repeatedly tries to get to quality contract language by having technology rummage in dysfunction. See for example this 2011 post and this 2015 post. Legal Robot is more of the same.
[Update 30 April 2017: If you haven’t read the note at the top of this post, now would be a good time to do so.]
The Times article offers an instructive example of the practical implications of that, in the form of Legal Robot’s message regarding shall. It’s appropriate to flag overuse of shall, but one thing that shall is not is vague. If you can’t accurately classify the different kinds of confusion to which contract language is subject, you don’t understand contract language. (For more about that, see this 2016 article.)
And saying that will or may is clearer, depending on the context, is deranged. In my world, shall is for language of obligation, will is for language of policy relating to contingent future events, and may is for language of discretion. (Go here for my “quick reference” chart on the categories of contract language.)
Legal Robot’s message was written by a person. So evidently Legal Robot is a combination of artificial intelligence rummaging in the cesspool and messages cobbled together by people who are blissfully unaware of the nuances of contract usages.
Before I read the Times article I did a jokey tweet, prompting the following response from Dan Rubins:
@KonciseD Ha! We almost wrote that message as "shall is the worst form of an obligation, except for all of the others" but decided for more nuance
— Dan Rubins (@DanRubins) April 28, 2017
“Ha!” is as good a way as any to summarize what I think of Legal Robot, based on what I learned from the Times article.