Judge Fines Attorneys $3,000 Each for AI-Generated Legal Filing Errors in Mike Lindell Defamation Case
A federal judge has sanctioned two attorneys representing MyPillow CEO Mike Lindell, fining them $3,000 each for submitting a court motion riddled with errors—including fake case citations—generated by artificial intelligence.

Key Details of the Sanctions
- Judge Nina Y. Wang (U.S. District Court, Denver) ruled that attorneys Christopher Kachouroff and Jennifer DeMaster violated court rules by filing a motion containing:
  - Misquoted legal precedents.
  - References to nonexistent court cases, a known failure mode of AI-generated legal research.
- Kachouroff admitted using generative AI to draft the motion but claimed the erroneous version was submitted accidentally.
- The judge found his explanation unconvincing, noting that even his “corrected” version contained errors and inconsistent timestamps.
Judge’s Rebuke: ‘Troubling and Not Well-Taken’
Wang criticized Kachouroff’s attempts to shift blame, including his suggestion that the court tried to “blindside” him over the mistakes. She wrote:
“Neither attorney provided any explanation for how these false citations appeared without either AI use or gross carelessness.”
- The judge concluded the errors were not mere oversights but sanction-worthy negligence.
- While she took “no joy” in penalizing the lawyers, she deemed the fines necessary to deter similar misconduct.
Background: Lindell’s Defamation Case
- The sanctions stem from Lindell’s failed defense in a defamation lawsuit brought by Eric Coomer, a former Dominion Voting Systems executive.
- A jury previously found Lindell liable for false election fraud claims, ordering him and his company to pay $2.3 million in damages.
- Lindell, a prominent 2020 election denier, had accused Coomer of “treason” and involvement in a nonexistent voting conspiracy.
Broader Implications
This case highlights growing judicial scrutiny of AI-assisted legal work, particularly after high-profile blunders like the 2023 “ChatGPT lawyer” incident in New York. Courts increasingly expect:
✔ Human verification of AI-generated content.
✔ Transparency when AI tools are used.
✔ Accountability for errors, even if technology-assisted.
What’s Next?
- The sanctioned attorneys have not commented on the ruling.
- Lindell, who is not responsible for the fines, continues to face legal battles over his election claims, including a separate $1.3 billion lawsuit from Dominion.
The Takeaway: While AI can streamline legal research, this case is a warning that submitting unverified AI output carries real professional and financial consequences.