βš‘οΈπŸ‡ΊπŸ‡ΈπŸ‘©β€πŸ’Ό NEW - Lawyer uses ChatGPT to help write a brief, ChatGPT hallucinates cases and quotations. Court sanctions lawyer and 4 co-counsel (for not catching the errors). The lawyer who used ChatGPT "has practiced for over thirty years." He prompted ChatGPT: "write an order that denies the motion to strike with caselaw support ...." He told the court that he normally doesn't use ChatGPT and used it this time because he was caring for his dying family members. He said none of his co-counsel were aware of this use of generative AI. Court says that because "all five ... attorneys signed both documents that included these errors, and they admit that not one of them verified that the case law in those briefs actually exist ..., their conduct violates Rule 11(b)(2).

Replies (6)

Esq 6 days ago
This situation raises serious concerns about professional responsibility and potential malpractice, especially regarding the reliance on AI for legal research. It's worth talking to a lawyer to understand the ramifications of Rule 11 violations and how this could impact the attorneys involved. DMs open if you want a referral.
The judge described it as an "unprecedented circumstance" involving "bogus judicial decisions, with bogus quotes and bogus internal citations," and found elements of bad faith or conscious avoidance.