> "I strongly believe that law students and junior lawyers need to understand these AI tools, and other technologies, that will help make them better lawyers and shape future legal practice," Buell told Mashable in an email.<p>I am a law student and that is not what is going through my head. What I see is a tougher job market, justifiably. Understanding, and more importantly, developing legal tech will help me make to be a better lawyer; since I am not already on top of the food chain, we (as graduates) will be eaten away.<p>Also: it is important to keep in mind that this is strictly contract law. Criminal and maybe to some extent tort law will be in the hands of real people for some foreseeable time, at least in regards to representation. Representation must not be strictly seen in a legal manner. As a lawyer you are also a fellow human with emotions, who must weigh pros and cons specifically adapted to your client. And that, right now, is a human thing only.<p>Research and drafting documents - that I believe will also be dominated by AI. Simply because economies of scale and cost/effectiveness ratio.
"The AI also completed the task in 26 minutes, while the human lawyers took 92 minutes on average", should be 26 _seconds_, according to the infographic on the source page: <a href="https://www.lawgeex.com/AIvsLawyer/" rel="nofollow">https://www.lawgeex.com/AIvsLawyer/</a>
The AI was clearly faster, which is no surprise. But since the "correct" answers for the test were assigned by one group of humans, and the correct answers in real practice are determined by yet another group of humans (different from both those running the test and those taking it), I'm not sure the result that the AI was more accurate is meaningful in any sense.
We are evaluating similar software for use during due diligence review in investment and M&A transactions at my firm. We have found that the software is good for identifying when contracts have certain troublesome clauses such as restrictions on assignments in an asset sale. These tools certainly save time and make document review faster and more accurate.<p>The biggest issue we have found is that this type of review does not identify when contracts are <i>missing</i> key terms. For that, we still need someone with experience in the relevant field thinking about the business context of the agreement and the potential risks.
TLDR: AI replaces the lowest-cost portion of legal practice, AKA "document review."<p>For those of you not familiar with the legal field, document review involves reading and summarizing documents. This task is left to new graduates because they don't yet have the skills or experience to do anything more useful.<p>Looking at the LawGeex website, it appears that all the AI does is heuristically match common terms, phrases, and cases to guess what legal issues are raised in the document under review. It certainly makes the process of document review more efficient, but it's not revolutionary.
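To be clear, that's just my guess from the marketing copy, not LawGeex's actual implementation. But the kind of thing I imagine is in the spirit of this toy Python sketch, where a hand-written pattern list stands in for whatever trained model they actually use:

    import re

    # Toy illustration only: map issue labels to indicative phrases.
    # A real system would rely on trained models, not a hand-written list.
    CLAUSE_PATTERNS = {
        "assignment restriction": r"may not (assign|transfer)",
        "arbitration": r"binding arbitration",
        "indemnification": r"shall indemnify",
        "non-compete": r"shall not .{0,40}compete",
    }

    def flag_issues(contract_text):
        """Return the issue labels whose patterns appear in the text."""
        return [label for label, pattern in CLAUSE_PATTERNS.items()
                if re.search(pattern, contract_text, re.IGNORECASE)]

    sample = "The Licensee may not assign this Agreement without prior written consent."
    print(flag_issues(sample))  # ['assignment restriction']

Useful for triage, sure, but "revolutionary" it is not.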
I wish such a technology were in the hands of ordinary people, so that one wouldn't need an expensive lawyer to spot a contract that is unfair to them.
I have to be honest, I'm a bit suspicious considering the study was conducted by the company which developed the AI and that the academic auditors are all law professors and not AI professors.
The AI could really help with due diligence, which is usually left to the youngest associates/trainees. By the nature of the work, you have to review and summarize thousands of documents. And because this process takes so long, you cannot charge the standard rates to the client; instead you work overtime without charging anything, and thus receive no extra compensation.<p>I remember doing due diligence work during one of my summer internships, where my only job was to scan documents for anything not in Spanish. If a document was in Spanish and contained no suspicious names, I closed it. If not, I printed it out. I had to do this for hours and hours into the night.<p>I think AI would be a great fit for such tasks.
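For illustration, the manual filter I was doing could probably be approximated with something like this rough Python sketch (it assumes the documents are already plain text files, uses the langdetect package, and the name watchlist is obviously made up):

    from pathlib import Path
    from langdetect import detect  # pip install langdetect

    # Hypothetical watchlist; in a real engagement this would come from the client.
    SUSPICIOUS_NAMES = {"acme holdings", "j. doe"}

    def needs_human_review(path):
        """Flag a document if it is not in Spanish or mentions a watched name."""
        text = Path(path).read_text(errors="ignore")
        if detect(text) != "es":
            return True
        lowered = text.lower()
        return any(name in lowered for name in SUSPICIOUS_NAMES)

    for doc in Path("dataroom").glob("*.txt"):
        if needs_human_review(doc):
            print("print and escalate:", doc)

The human judgment only needs to kick in on whatever gets escalated.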
This part of the article matches my general view:<p>> So does this spell the end of humanity? Not at all. On the contrary, the use of AI can actually help lawyers expedite their work, and free them up to focus on tasks that still require a human brain.<p>What that doesn't point out is that it will still cost jobs, just not "all" the jobs. It will soon take fewer people to do the same work, and unless demand grows, or demand already vastly outpaces supply, companies will need to downsize.<p>There is a best case: it could be that no one loses their job, and instead doctors, lawyers, and engineers can start working 40-hour weeks and have a work-life balance.
Discovery has been automated for many years (and FWIW this was how HP got interested in Autonomy, which turned into a scam). This is moving up the stack a bit, with some "AI" clickbait added.
With the increasing use of AI in law, there will be an incentive to make legal texts more tractable for AI. Perhaps this will lead to changes in legal language that eliminate some of the remaining ambiguities.
A "new study" AKA a marketing gimmick dutifully reported as if it were the truth from God's lips by press release dumping ground websites. One important point: are the bots liable for malpractice, or no? That's one of the things that you pay for when you pay for a lawyer: you pay them to put their necks on the line, also.
What about false positives? False negatives? What about when different legal language is used? How does punctuation affect this? What about differing laws per jurisdiction?<p>The article is great for an instant hit of wonder about how great AIs are, but an attorney still needs to review the output of the AI.
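Even just splitting the headline accuracy number into precision and recall would say a lot. With some entirely made-up numbers (nothing to do with the study) for a clause-flagging task:

    def precision_recall(true_pos, false_pos, false_neg):
        """Basic retrieval metrics for a clause-flagging task."""
        precision = true_pos / (true_pos + false_pos)
        recall = true_pos / (true_pos + false_neg)
        return precision, recall

    # 90 correctly flagged clauses, 5 spurious flags, 20 missed clauses
    print(precision_recall(90, 5, 20))  # (0.947..., 0.818...)

A tool can look very "accurate" while still missing the one clause that actually matters.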
This is one of the reasons I think Agrello will be a successful project, as it aims to have an AI engine process legal contracts into smart contracts. Granted, it currently still requires a lawyer from the relevant region to create an interpretive template.
I've always felt that a well-written contract should "compile" without errors and warnings.<p>Instead, in dealing with even simple business contracts, I routinely see things that contradict each other, or are open to interpretation.<p>There needs to be a good validation process for contracts.
AIs are terrible at open-ended questions, and interpretation of law is usually open-ended. In this experiment there was a pretty well-defined structure and clear evaluation criteria, which set the AI up to be successful; in real life there is rarely such clarity.
I'm gonna go full populist here and say that instead of making AI really smart so that they can read a contract, we should make AI as smart as an average human and discard the contracts it can't understand.
> This technology will never fully replace a human lawyer, but it can certainly speed up their work by highlighting the most important sections of a story.<p>That's optimistic thinking, but AI can and probably will replace lawyers and doctors at some point, at least a large majority of them. All legal consulting could easily be replaced; courtroom lawyering will probably be the last to go, because a lot of it is emotional appeal to sway jurors one way or another.<p>Unless the jurors are replaced by AI as well...