Lawyer Facing Discipline After Using ChatGPT To Cite Non-Existent Case


On Tuesday, January 30, 2024, the U.S. Court of Appeals for the Second Circuit referred attorney Jae Lee to its attorney grievance panel after Lee admitted to citing a non-existent case in a brief.

Lee represented Minhye Park in a medical malpractice lawsuit against David Dennis Kim. The U.S. District Court for the Eastern District of New York dismissed Park’s case against Kim “for her persistent and knowing violation of court orders.” Park appealed the decision to the U.S. Court of Appeals for the Second Circuit. Lee’s appellate reply brief cited Matter of Bourguignon v. Coordinated Behavioral Health Servs., Inc., 114 A.D.3d 947 (3d Dep’t 2014). Unable to locate the case, the Second Circuit ordered Lee to submit a copy of the decision to the Court. Lee responded that she was “unable to furnish a copy of the decision.” The case does not exist. Her response explained: “I utilized the ChatGPT service . . . for assistance in case identification . . . The case mentioned above was suggested by ChatGPT, I wish to clarify that I did not cite any specific reasoning or decision from this case.”

The panel of Second Circuit judges referred Lee to the Court’s attorney grievance panel for citing the non-existent case. The panel relied on Federal Rule of Civil Procedure 11 as “requir[ing] that attorneys read, and thereby confirm the existence and validity of, the legal authorities on which they rely.” The panel went on to clarify that while several courts have proposed or enacted local rules or orders specific to the use of artificial intelligence, “such a [local] rule is not necessary to inform a licensed attorney, who is a member of the bar of this Court, that she must ensure that her submissions to the Court are accurate.” The panel determined that “Lee made no inquiry, much less the reasonable inquiry required by Rule 11 and long-standing precedent, into the validity of the arguments she presented.”

Lee is the latest in a string of lawyers facing discipline for relying on ChatGPT and other generative AI programs. One of Michael Cohen’s lawyers is seeking to avoid sanctions for filing court papers that included fake case citations generated by Google Bard. Last year, two New York lawyers were sanctioned for filing a brief containing citations generated by ChatGPT, and a Colorado lawyer was suspended for filing a motion that cited non-existent case law generated by ChatGPT.

Additional Reading

Another NY lawyer faces discipline after AI chatbot invented case citation, Reuters (January 30, 2024)

Opinion in Park v. Kim, No. 22-2057 (2d Cir. 2024)
