Written by: Justine M. Ware
ChatGPT is an artificial intelligence "chatbot" developed by OpenAI and designed to mimic human conversation and writing. Since its prototype was released in November 2022, ChatGPT has grown exponentially. In January 2023, the platform was touted as the fastest-growing consumer application in history, with an estimated 100 million monthly users. Now, for a small monthly subscription fee, anyone can use ChatGPT to generate articles, poetry, essays, and even jokes. But can it help lawyers generate successful motions or briefs?
The short answer is no. Unfortunately, ChatGPT is not ready for the legal world. The rise of artificial intelligence has fostered the notion that some program or software is all-knowing: it has access to, presumably, all facets of the Internet and can sift through information at a speed unattainable by humans. However, AI as a concept is in its beginning stages and is certainly fallible.
A New York lawyer has learned the hard way that ChatGPT is not as reliable as it may seem. The Manhattan lawyer was sanctioned $5,000 for a brief filed in federal court that was co-authored by ChatGPT and cited six fictitious court decisions. This error raises the question: is this the fault of the computer or the fault of the lawyer? The American Bar Association's Model Rules of Professional Conduct do not mention artificial intelligence. However, the ethical rules may still apply.
According to a May 30, 2023, Reuters article by Karen Sloan, the Rule of Competency may be implicated in this situation. This rule requires lawyers to possess the legal knowledge, skill, thoroughness, and preparation necessary to provide adequate legal representation. Thoroughness requires an attention to detail that should go into writing any legal memorandum or brief. All I's must be dotted, and all T's crossed. When one relies not on one's own intelligence but on the artificial intelligence of computer software, there is great room for error.
Similarly, the Rule of Diligence may apply. As lawyers, it is our duty to explore every avenue for argument and to conduct any and all research necessary to make those arguments. While AI is intelligent, it is clearly not diligent. Artificial intelligence cannot be expected to understand the nuances of the law or to decipher complex factual scenarios. Lawyers, however, have been trained in just that for years. We owe a duty to our clients to remain diligent and to do the painstaking fact-checking and legal research that a successful pursuit of justice requires.
The full fallout of this unfortunate event in New York remains unclear, but in the meantime, lawyers should remain wary of artificial intelligence. It is a new and shiny toy that can bring a lot of fun and information into our lives. However, ChatGPT should be avoided in professional settings, especially those involving the judicial system. ChatGPT is not better than lawyers, at least for now.