Attorney Admits to Using ChatGPT for Case Research

ODSC - Open Data Science
2 min read · Jun 2, 2023

A lawyer out of New York is in hot water after his firm used ChatGPT for legal research. According to a report by the BBC, the lawyer who used the tool told the court that he was “unaware that its content could be false.”

As many avid users of ChatGPT know, the LLM can at times produce false information, known as hallucinations. The judge presiding over the case said the court was faced with an "unprecedented circumstance": a filing, produced with ChatGPT, that referenced legal cases that didn't exist.

The case where the problem arose involves a man suing an airline over an alleged personal injury. In preparing the case, his legal team submitted a brief that cited several court cases as precedent.

But there was a problem. The opposing lawyers representing the airline went through the brief and found that several of the cited cases did not exist. Judge Castel, the judge overseeing the case, when presented with this information, demanded that the plaintiff's legal team explain itself.

The judge said in part, "Six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations." Before you assume this was the work of a rookie lawyer fresh out of law school, it wasn't.

According to the same report, the lawyer who used ChatGPT to look for similar cases, Mr. Schwartz, has 30 years of experience, yet he didn't cross-reference the cases the AI generated. In a reply, Mr. Schwartz said that he "greatly regrets" leaning on the chatbot to produce his legal research.

Because of this, both lawyers involved in the plaintiff's case now have to explain to the court, at a June 8th hearing, why they shouldn't be disciplined for their actions. Though LLMs such as ChatGPT, Google's Bard, and others are often credited for their ability to recall information, they have also raised concerns due to the hallucinations they produce.

In one such case in Australia, ChatGPT described a mayor as a criminal when in fact he was not. This has led to a defamation lawsuit against OpenAI over what its LLM generated in reference to the man.

Originally posted on OpenDataScience.com

Read more data science articles on OpenDataScience.com, including tutorials and guides from beginner to advanced levels! Subscribe to our weekly newsletter here and receive the latest news every Thursday. You can also get data science training on-demand wherever you are with our Ai+ Training platform. Subscribe to our fast-growing Medium Publication too, the ODSC Journal, and inquire about becoming a writer.
