ChatGPT 'Hallucinated' Court Cases Cited By NY Attorneys in Federal Court Filing
The case in question was a personal injury suit filed against Avianca Airlines.
The plaintiff's attorney, Peter LoDuca, cited a number of court cases that attorneys for Avianca could not substantiate.
It was later revealed that LoDuca had used ChatGPT to research useful cases to cite.
Federal Judge P. Kevin Castel gave LoDuca an opportunity to admit that the cases were fabricated.
Instead, LoDuca allegedly used ChatGPT again to generate the full text of the cases it had initially cited, even though they were AI fabrications.
In a court filing that could lead to sanctions against the offending attorneys, Castel wrote that the initial federal court filing was “replete with citations to non-existent cases.”
In addition, Castel referred to the attorneys' research as “bogus judicial decisions with bogus quotes and bogus internal citations.”
The term "hallucinated" has already been coined to refer to instances in which ChatGPT creates information that is not real.
This case highlights a growing trend across research-intensive fields in which AI output is used without being cross-referenced or checked for errors.
It creates new challenges in complex fields such as law, media, and academia, challenges Castel alludes to in his filing.
“The Court is presented with an unprecedented circumstance.” — Federal Judge P. Kevin Castel, via NBC News