In early 2024, the Missouri Court of Appeals issued a decision that quickly drew national attention. In Kruse v. Karlen, 692 S.W.3d 43 (Mo. Ct. App. 2024), a pro se (self-represented) litigant submitted an appellate brief that appeared, at first glance, to be well supported by legal authority. It included quoted passages, case names, and citations that looked authentic.
They were not. The cited cases did not exist. The authorities had been generated by artificial intelligence.
The court dismissed the appeal and imposed $10,000 in sanctions. The judges made their position clear: courts cannot, and will not, accept filings generated by AI without careful review and verification by a licensed attorney. Similar issues have surfaced in courts across the country, including Oklahoma.
In Mattox v. Product Innovations Research USA (Case No. 6:24-cv-235-JAR, E.D. Okla.), Judge Timothy DeGiusti addressed an AI-related filing failure on a scale rarely seen before the rise of generative tools. The pleadings contained fabricated citations, misquoted authorities, and references to cases that simply did not exist. The court struck the filings and ordered the plaintiffs to pay more than $11,700 in attorney fees, before the first witness was ever called.
The issue in Mattox was not the technology itself. It was the human decision to rely on it without proper oversight. The case reinforces a basic principle: AI cannot take the place of a lawyer. Treating it as though it can puts cases, rights, finances, and futures at risk.
At Ball Morse Lowe, we understand how overwhelming legal problems can feel. But cutting corners—especially by relying on tools that are not designed to provide legal advice—often creates more problems than it solves.
AI Can Inform, but It Cannot Protect
AI tools are appealing because they feel fast and inexpensive. But when someone relies on AI alone, they are relying on a system that:
- Does not understand the full context of their situation
- Does not create an attorney-client relationship
- Does not owe duties of confidentiality or loyalty
- Cannot appear in court or stand behind its guidance
Information shared with an AI chatbot is not protected by attorney-client privilege. And if the tool misunderstands key facts or provides incorrect guidance, the responsibility falls entirely on the user—not on the software developer.
AI Frequently Gets the Law Wrong
Kruse v. Karlen resonated nationwide because the problem it exposed is common. Generative AI tools often:
- Invent case law
- Blend rules from different states
- Provide incorrect deadlines
- Misstate procedural requirements
- Use language that undermines credibility with the court
These systems generate text that sounds convincing, not text that has been independently verified. When the stakes involve family, finances, or personal liberty, sounding right is not enough.
Oklahoma Courts Expect Accuracy From Everyone
Whether a party is represented by counsel or proceeding on their own, Oklahoma courts require strict compliance with procedural and substantive rules. Judges have begun issuing formal guidance on the use of AI in legal filings.
The U.S. District Court for the Eastern District of Oklahoma, for example, has issued a standing order addressing generative AI. Among other things, it requires:
- Disclosure: Any filing prepared with the assistance of generative AI must identify the tool used.
- Verification: The filer must confirm the accuracy of all AI-generated content, including citations.
- Responsibility: The individual submitting the document is fully accountable for its contents and may face sanctions for violations.
If AI leads to a filing that is inaccurate, incomplete, or misleading—whether intentionally or not—the court will hold the filer responsible. As Kruse and Mattox show, those consequences can be immediate and costly.
Legal Problems Require Judgment, Not Just Information
While AI can summarize general concepts, it cannot provide the judgment or strategic thinking that real cases demand.
Legal matters often involve:
- Protecting children and parental rights
- Making significant business decisions
- Resolving property and financial disputes
- Responding to criminal allegations
- Navigating emotionally charged family situations
These are not situations where generic answers or automated responses should determine the outcome. They require careful evaluation and informed decision-making based on how Oklahoma courts actually function.
AI Is a Tool for Lawyers, Not a Replacement
At Ball Morse Lowe, technology is used to improve efficiency—but never at the expense of accuracy or professional responsibility. AI can assist with organization, research support, and other routine tasks, but every pleading, strategy, and recommendation is reviewed and tailored by a licensed attorney.
Used appropriately, technology can reduce unnecessary costs and streamline work. Used incorrectly, it can derail a case before it ever gets off the ground.
If It Matters, Get Real Legal Counsel
The lessons from Kruse and Mattox are clear: shortcuts in legal matters are often expensive ones. AI is a powerful tool, but it cannot guide someone through the Oklahoma legal system or shield them from the consequences of incorrect information.
At Ball Morse Lowe, we help clients understand their options, evaluate risks, and move forward with clarity.
If you need guidance, we are ready to help. Contact us today to speak with a licensed attorney.
Read the full blog: Overruled: Why AI Should Never Be Your Lawyer