A Lawyer Used ChatGPT for a Legal Filing, and the AI Cited Nonexistent Cases
For thirty years, Steven Schwartz has been an attorney with Levidow, Levidow & Oberman. Now, a single mistake may ruin his career.
Why? He used ChatGPT to prepare court documents, and, as Schwartz himself described, the AI chatbot completely fabricated the prior cases it cited.
It all begins with the disputed Mata v. Avianca case. The New York Times reported that Roberto Mata, an Avianca passenger, was suing the airline for damages after a serving cart injured his knee during a flight. Avianca asked the judge to dismiss the case. In opposition, Mata’s attorneys filed a brief chock-full of precedent from comparable cases. That’s where ChatGPT stepped in.
Schwartz, Mata’s lawyer who filed the case in state court and then provided legal research once it was transferred to Manhattan federal court, said he used OpenAI’s popular chatbot in order to “supplement” his own findings.
Varghese v. China Southern Airlines, Shaboon v. Egyptair, Petersen v. Iran Air, Martinez v. Delta Airlines, Estate of Durden v. KLM Royal Dutch Airlines, and Miller v. United Airlines are just a few of the cases ChatGPT gave Schwartz.
The issue? All of those examples were entirely made up by ChatGPT. They are nonexistent.
Avianca’s legal team and the judge assigned to the case soon realized they could not locate any of these court decisions. This led Schwartz to explain, in an affidavit filed on Thursday, that he had relied on ChatGPT for help with his filing.
Lawyer Used ChatGPT for Legal Filing
It’s crucial to remember that ChatGPT, like all AI chatbots, is a language model trained to follow instructions and respond to user queries. This means that when a user asks ChatGPT for information, it may produce an answer that sounds precise and authoritative even when it isn’t accurate.
Schwartz claimed he was “unaware of the possibility that its content could be false.” The attorney even gave the judge screenshots of his conversations with ChatGPT in which he asked whether one of the cases was genuine. ChatGPT said that it was, and even claimed the cases could be found in “reputable legal databases.” Again, none of them could be located, because the chatbot had fabricated every single one.
A lawyer who uses ChatGPT for a legal filing should do more research before relying entirely on a chatbot’s findings; failing to do so can jeopardize the case, as well as the lawyer’s career and credibility.
The judge has ordered a hearing next month to “discuss potential sanctions” for Schwartz in response to this “unprecedented circumstance”: a lawyer filing a legal brief built on fake court decisions and citations supplied by ChatGPT.