EastIdahoNews.com is partnering with local attorneys to bring you information about legal topics that readers often have questions about. These weekly columns will run each Sunday afternoon. Anyone with a question for the attorneys should email nate@eastidahonews.com.
In June 2023, a U.S. District Court judge sanctioned two New York attorneys for submitting a legal brief that cited six fictitious cases generated by ChatGPT. Then, in November 2023, a Colorado attorney was suspended from practicing law for a year after using ChatGPT to draft legal pleadings that contained fabricated cases. More recently, a South Carolina man was accused of pretending to be a lawyer while using AI to draft his legal pleadings.
These are just some of the many instances where attorneys and non-attorneys have run into trouble using artificial intelligence (AI) in the legal profession. Despite this, AI in the legal profession isn’t going away any time soon. If anything, using AI to assist in legal research or draft pleadings appears to be the way of the future.
Case Research
Historically, lawyers conducted legal research using books and legal treatises. This practice was revolutionized with the advent of the internet. But even then, lawyers had to take classes in law school dedicated to learning how to perform proper boolean searches to find relevant case law. Like the internet, AI has further revolutionized this process.
Both Westlaw and LexisNexis now offer AI-assisted case search options. Instead of crafting a precise boolean search string, the lawyer simply asks the AI a question. The AI then generates a report with case citations that answer it. As a result, AI can deliver answers more quickly than traditional search methods and can surface cases that might otherwise have been overlooked because of overly narrow search parameters.
Despite being faster, AI can produce incorrect or even false information. In what are known as “hallucinations,” the AI can invent case law out of thin air. It also has a propensity to cite unrelated and irrelevant sources in its answer to your question.
These AI “hallucinations” are what got the attorneys in New York and Colorado in trouble. Had those attorneys taken the time to review the information the AI provided, they would have discovered that certain cases did not exist. Because they failed to do so, they were disciplined by the courts.
I have encountered several AI hallucinations in my own practice. In one instance, the AI reported that a particular statute of limitations applied to my case. This surprised me, as I was unaware of the statute the AI cited as authority on the issue. After digging deeper into the AI’s citations, I discovered that it had pulled a statute from an entirely different body of law, one that applied only in specific circumstances. Had I not double-checked the AI’s report and instead cited that statute in support of my argument, opposing counsel would have had a field day ripping my briefing apart.
And therein lies the problem with AI at this time. Yes, AI has improved the speed and accuracy of legal research. However, a lawyer must always verify the AI’s findings to protect against inaccuracies and hallucinations.
Drafting
Generative AI tools like ChatGPT are built on large language models, or LLMs. An LLM is trained on a vast amount of text, from which it draws its answers. For example, GPT-3, an early model behind ChatGPT, had roughly 175 billion parameters. But the material an AI works from need not be that broad. An attorney can direct the AI to just one or a handful of specific documents, and these small, focused document sets can benefit attorneys when reviewing information and drafting.
For example, a lawyer can upload a single contract and ask the AI to summarize its contents or answer specific questions. AI can even provide summaries and time stamps from videos, such as police body camera footage, saving the attorney (and the client) several hours of work.
In drafting, a lawyer can draw on his entire library of existing documents to create a new document with specific, unique parameters. Of course, the newly created document is rarely complete. For that reason, AI is better used as an outlining tool that pulls from an attorney’s existing library, thereby increasing efficiency.
AI is a Tool, Not a Replacement
At this time, AI is best used as a tool for the lawyer rather than as his replacement. Even as AI improves and its underlying LLMs are refined, attorneys will still be needed to advocate for their clients and to handle the many human factors of the law, such as equity and fairness.
Source: https://www.eastidahonews.com/