Technology

A Man Sued Avianca Airline. His Lawyer Used ChatGPT.

The lawsuit started like many others. A man named Roberto Mata sued the airline Avianca, saying he was injured when a metal serving cart struck his knee during a flight to Kennedy International Airport in New York.

When Avianca asked a federal judge in Manhattan to dismiss the lawsuit, Mr. Mata’s lawyers vehemently objected, submitting a 10-page brief that cited more than half a dozen relevant court decisions. There was Martinez v. Delta Air Lines, Zicherman v. Korean Air Lines and, of course, Varghese v. China Southern Airlines, with its learned discussion of federal law and “the tolling effect of the automatic stay on a statute of limitations.”

There was only one problem. Neither the airline’s lawyers nor the judge himself could find the decisions or the quotations cited and summarized in the brief.

That was because ChatGPT had invented it all.

The lawyer who drafted the brief, Steven A. Schwartz of the firm Levidow, Levidow & Oberman, threw himself on the mercy of the court on Thursday, saying in an affidavit that he had used the artificial intelligence program to do his legal research, calling it “a source that has revealed itself to be unreliable.”

Mr. Schwartz, who has practiced law in New York for three decades, told Judge P. Kevin Castel that he had no intent to deceive the court or the airline. Mr. Schwartz said he had never used ChatGPT for legal research before and was “therefore unaware of the possibility that its content could be false.”

He told Judge Castel that he had even asked the program to verify that the cases were real.

It said yes.

Mr. Schwartz said he “greatly regrets” having relied on ChatGPT “and will never do so in the future without absolute verification of its authenticity.”

In his order, Judge Castel said he had been presented with “an unprecedented circumstance”: a legal submission replete with “bogus judicial decisions, with bogus quotes and bogus internal citations.” He ordered a hearing for June 8 to discuss possible sanctions.

As artificial intelligence has swept the online world, it has conjured dystopian visions of computers replacing not only human interaction but also human labor. The fear is especially acute for knowledge workers, many of whom worry that their daily activities may not be as rarefied as the world thinks, even though the world pays billable hours for them all the same.

Stephen Gillers, a legal ethics professor at New York University School of Law, said the issue was particularly acute among lawyers, who have been debating the value and the dangers of AI software like ChatGPT, as well as the need to verify whatever information it provides.

“The discussion now among the bar is how to avoid exactly what this case describes,” Mr. Gillers said. “You cannot just take the output and cut and paste it into your court filings.”

The real-life case of Roberto Mata v. Avianca shows that white-collar professions may have at least a little time left before the robots take over.

The lawsuit alleges that Mr. Mata was a passenger on Avianca Flight 670 from El Salvador to New York on August 27, 2019, when an airline employee struck him with the serving cart. After Mr. Mata sued, the airline filed papers asking that the case be dismissed because the statute of limitations had expired.

In a brief filed in March, Mr. Mata’s lawyers said the lawsuit should continue, bolstering their argument with references to and quotations from the many court decisions that have since been debunked.

Avianca’s lawyers soon wrote to Judge Castel, saying they were unable to find the cases cited in the brief.

Regarding Varghese v. China Southern Airlines, they wrote that they had “not been able to locate this case by caption or citation, nor any case bearing any resemblance to it.”

They pointed to a lengthy quotation from the purported Varghese decision contained in the brief. “The undersigned has not been able to locate this quotation, nor anything like it, in any case,” Avianca’s lawyers wrote.

Indeed, the lawyers added, the quotation, which came from Varghese itself, cited an opinion called Zicherman v. Korean Air Lines, purportedly handed down by the U.S. Court of Appeals for the 11th Circuit in 2008. They could not find that, either.

Judge Castel ordered Mr. Mata’s attorneys to provide copies of the opinions referred to in their brief. The lawyers submitted a compendium of eight; in most cases, they listed the court and judges who issued them, along with the docket numbers and dates.

For example, the purported copy of the Varghese decision is six pages long and says it was written by a panel of three judges of the 11th Circuit. But Avianca’s lawyers told the judge that they could not find that opinion, or the others, in court records or legal databases.

Bart Banino, a lawyer for Avianca, said his firm, Condon & Forsyth, specialized in aviation law and that its lawyers could tell the cases in the brief were not real. He added that they had an inkling a chatbot might have been involved.

Mr. Schwartz did not respond to messages seeking comment, nor did Peter LoDuca, another lawyer at the firm whose name appeared on the brief.

Mr. LoDuca said in an affidavit this week that he did not conduct any of the research in question, and that he had “no reason to doubt” the sincerity of Mr. Schwartz’s work or the authenticity of the opinions.

ChatGPT generates realistic responses by making guesses about which fragments of text should follow other sequences, based on a statistical model that has ingested billions of examples of text pulled from all over the internet. In Mr. Mata’s case, the program appears to have discerned the labyrinthine framework of a written legal argument, but populated it with names and facts from a bouillabaisse of existing cases.
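The next-word-guessing idea can be seen in miniature with a toy script. This is only an illustration of the statistical principle, not how ChatGPT actually works: real large language models use deep neural networks over subword tokens trained on billions of examples, while this sketch merely counts which word follows which in a tiny made-up corpus and samples accordingly.

```python
import random
from collections import Counter, defaultdict

# A tiny, made-up corpus standing in for "billions of examples of text."
corpus = (
    "the court held that the motion was denied and the court held "
    "that the claim was dismissed"
).split()

# Count which word follows which (a bigram model).
successors = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    successors[word][nxt] += 1

def generate(start, length, seed=0):
    """Extend `start` by repeatedly sampling a likely next word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        options = successors[out[-1]]
        if not options:
            break  # no observed successor; stop generating
        words, counts = zip(*options.items())
        out.append(rng.choices(words, weights=counts)[0])
    return " ".join(out)

print(generate("the", 6))
```

Every word the script emits is plausible locally, because each one really did follow its predecessor somewhere in the corpus, yet the sentence as a whole may describe no ruling that ever existed. That mismatch between local fluency and global truth is the mechanism behind the fabricated citations.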

Judge Castel indicated in his order for the hearing that he had made his own inquiry. He wrote that the clerk of the 11th Circuit had confirmed that the docket number printed on the purported Varghese opinion belonged to an entirely different case.

Judge Castel called the opinion “bogus,” noting that it contained internal citations and quotations that were themselves nonexistent. He said five other decisions submitted by Mr. Mata’s lawyers also appeared to be fake.

On Thursday, Mr. Mata’s lawyers filed affidavits containing their version of what had happened.

Mr. Schwartz wrote that he had originally filed Mr. Mata’s lawsuit in state court, but after the airline had it moved to Manhattan’s federal court, where Mr. Schwartz is not admitted to practice, one of his colleagues, Mr. LoDuca, became the attorney of record. Mr. Schwartz said he had continued to do the legal research, in which Mr. LoDuca had no role.

Mr. Schwartz said he had consulted ChatGPT to “supplement” his own research, and that in consulting it he found and cited the half-dozen nonexistent cases. He said ChatGPT had provided reassurances.

According to a copy of the exchange provided to the judge, he typed, “Is Varghese a real case?”

The chatbot replied “yes,” provided a citation and added that it “is a real case.”

Mr. Schwartz dug deeper.

According to the filing, he wrote, “What is your source?”

ChatGPT responded, “I apologize for the confusion earlier,” and offered a legal citation.

“Are the other cases you provided fake?” Mr. Schwartz asked.

ChatGPT replied, “No, the other cases I provided are real and can be found in reputable legal databases.”

But, as it turned out, they could not be.

Sheelagh McNeill contributed research.
