Richard Bednar, a Utah attorney, has drawn the wrath of the Utah Court of Appeals after submitting a brief built on invented or inaccurate citations. The brief cited a made-up case, "Royer v. Nelson," which appears in no legal or court database. The episode serves as a cautionary tale about the dangers of using AI for legal documents without human verification.
The scandal erupted when Bednar and fellow Utah attorney Douglas Durbano, acting on behalf of several non-attorney petitioners, filed a petition for interlocutory appeal. In April, the court convened to consider the petition, which required digging into the substance of the claims as stated in the brief. As the review progressed, it became increasingly clear that the document was riddled with factual misstatements, raising doubts about how it had been prepared.
The court's decision to sanction Bednar rested on allegations that he had filed a misleading brief fraught with incorrect citations. The respondent's counsel noted, "It appears that at least some portions of the Petition may be AI-generated, including citations and even quotations to at least one case that does not appear to exist in any legal database (and could only be found in ChatGPT) and references to cases that are wholly unrelated to the referenced subject matter."
The Utah Court of Appeals agreed that the petition contained these fatal mistakes, and petitioners' counsel acknowledged the errors and apologized. The court stressed that attorneys cannot disregard their gatekeeping duties. In its view, AI can be an enormously helpful legal research aid when drafting pleadings, and the technology will continue to develop as new innovations emerge. But public trust demands that every attorney carefully inspect their court filings; it is their ongoing obligation to ensure those records are accurate. In this instance, petitioners' counsel did not live up to that expectation and obligation as members of the Utah State Bar: they forwarded a brief loaded with fake case law produced by ChatGPT.
This incident points to broader questions about the proper use of AI in legal practice. While tools like ChatGPT can assist attorneys with research and drafting, they must be used with caution, and their output must always be verified against authoritative legal sources.