
We Don’t Use AI at Persaud Law Office

  • Writer: Kyle Persaud
  • 5 days ago
  • 4 min read

Updated: 4 days ago

In all the time I have practiced law, I have never used artificial intelligence to write a legal brief. This is because there have been so many instances of AI-written legal briefs that contain citations to cases and laws that do not exist. Recently, a representative of a business that provides technology support for my firm asked me whether I use AI. I told him no. I believe that the clients who pay me deserve better.


Every blogger, it seems, has been writing about AI, so I might as well take my turn. I’m going to tell you why we don’t use AI at Persaud Law Office.

 

The Problem With AI-Generated Legal Briefs


Since AI has become popular, many lawyers are using AI to write their legal briefs. Almost as soon as this practice became widespread, news stories began to emerge about how AI-generated legal briefs contained “hallucinations” – that is, the briefs included cases and laws that did not exist.

 

A Notorious Example: Mata v. Avianca


One of the most notorious cases in which this occurred was Mata v. Avianca, a case in New York. In Mata, the plaintiff’s attorney used ChatGPT to write a brief for him. This attorney, Steven Schwartz, later admitted that he had never used ChatGPT before, and had heard of ChatGPT only through “press reports and conversations with family members.”


But, as it turned out, six of the cases that ChatGPT cited in the brief did not exist. The opposing attorneys called this to the judge’s attention when they filed a memorandum stating that they could not find any of the cases listed in Mr. Schwartz’s brief. The judge then ordered Mr. Schwartz to submit a new brief, and to attach the actual cases that he had cited. Mr. Schwartz then submitted a brief, to which he attached what he claimed were “excerpts” of these cases. (ChatGPT had generated these “excerpts” as well.)


The judge then contacted the court clerk of the court that had allegedly decided one of the cases Mr. Schwartz had submitted. The clerk confirmed that the court had decided no such case. The judge also searched for the other cases, and found that these cases did not exist either.

 

The Consequences


The judge fined Mr. Schwartz and his law firm $5,000. In his ruling, the judge noted that the Federal Rules of Civil Procedure state that a judge may sanction a lawyer if the lawyer includes erroneous points of law in an argument. The judge also quoted the New York Rules of Professional Conduct, which state that it is unethical for a lawyer to make false statements of law to a court. Furthermore, in reference to the text of the non-existent cases that Mr. Schwartz submitted, the judge observed that it is a federal crime to forge the signature of a federal judge or to forge the seal of a federal court. The judge stated that the fake court opinions did not include a signature or seal, and therefore, Mr. Schwartz did not commit this crime. The judge wrote, however, that “the citation and submission of fake opinions raises similar concerns” to the issues that the federal criminal law was meant to address.

 

This Problem Isn’t Going Away


Mata is not an isolated incident. This past week, I received an e-mail from the ethics counsel of the Oklahoma Bar Association which stated,

“As of today, there are at least 455 documented incidents involving the use of hallucinated case law or erroneous statements produced by AI.”

That number should concern every lawyer who values integrity and accuracy in their work.

 

When AI Came Too Close to Home


The issue of AI hallucinations of legal cases hit close to home for me recently. Another attorney asked me to write a brief for her. I wrote the brief and sent it to her. She then e-mailed me back with a list of court cases that she thought I should include in the brief. When I looked up these cases, I found that most of them did not exist, and that the ones that did exist did not make the points her e-mail said they did. None of the cases were of any help to my brief.

 

After reading all this, I confronted this lawyer, and asked her, “Did you use AI to find these cases?” The lawyer admitted that she had. I explained to this lawyer that these cases did not exist, and I sent her an article about AI hallucinations in legal briefs.


Also, just today, I was in court, listening to litigants arguing a case before a judge. The defendant, who was representing himself, had submitted a brief to the court. The opposing lawyer said, “There was only one case listed in the defendant’s brief. The case does not exist. The brief was clearly written by AI.”

 

After hearing the parties argue this case, I downloaded the defendant’s brief and read it myself. I looked up the case the defendant cited. I found that there really was a case where the parties had the same names, but everything else the defendant’s brief said about the case (such as the case citation number, and the details and holding of the case) was fictitious.

 

Those two incidents (the situation with my lawyer friend, and the case I saw being argued in court) show that AI hallucinations of court cases are not isolated incidents that appear only in news stories. Nor are AI hallucinations rare. AI hallucinations are regular, everyday occurrences, and that is why I do not use AI to write briefs.

 

A Tradition of Human Craftsmanship


Lawyers wrote briefs for centuries before AI ever existed. My position is: If Thomas Jefferson didn’t need AI to write the Declaration of Independence, and Thurgood Marshall didn’t need AI to argue Brown v. Board of Education before the Supreme Court and secure the right of blacks to attend school with whites, why do I need to use AI now? As the old saying goes, “If it ain’t broke, don’t fix it.”

 

I have never used AI to write a legal brief. And I don’t plan to ever do so. If you retain the Persaud Law Office as your counsel, you can be assured that humans will have written any document that we file on your behalf.

 

NOTE: The information provided on this website is not intended to be, and does not constitute, the giving of legal advice. The information provided here is not intended to be, and should not be used as, a substitute for individual reliance on privately retained legal counsel. Information provided on this site may not constitute the most current or complete information with respect to legal topics or developments. Mr. Persaud expressly disclaims all liability based on any information contained on this site.

© 2025, by Kyle Persaud.
