The Dangers of ChatGPT

As artificial intelligence (AI) tools become increasingly accessible and sophisticated, more individuals, especially pro se litigants (those who represent themselves in court without an attorney), are turning to them for help drafting legal documents. These tools, including AI-based chatbots and document generators, promise affordability, speed, and simplicity in navigating the complex world of litigation. However, the intersection of AI-generated legal documents and civil procedure presents serious risks. Chief among these are procedural missteps, legal errors, and even sanctions under Rule 11 of the North Carolina Rules of Civil Procedure.

AI offers pro se litigants a valuable tool for bridging the “justice gap” - the divide between individuals who need legal help and those who can afford it. AI tools can assist in drafting motions, complaints, discovery requests, and even settlement agreements. These technologies can improve access to justice, helping litigants find relevant laws, format documents correctly, and understand basic legal principles.

For many, particularly in civil matters like small claims, landlord-tenant disputes, and family law, AI feels like a lifeline. In North Carolina, where legal aid resources are stretched thin, the affordability and availability of AI tools can seem empowering.

However, AI is not a substitute for legal education or training. Most AI tools do not teach civil procedure or the nuanced application of substantive law. A pro se litigant may draft a document that looks correct but fails to meet procedural requirements - such as proper service, filing deadlines, jurisdictional rules, or formatting. AI may also generate content that is inaccurate, outdated, or even fictitious. Without a foundational understanding of the rules of civil procedure, pro se litigants risk having their filings dismissed or ignored, delaying justice and potentially prejudicing their cases. For instance, a pro se litigant in North Carolina drafting a motion to compel discovery using an AI tool may unknowingly skip a required meet-and-confer step under Rule 37. The court could deny the motion outright for failing to follow required procedures - even if the underlying request has merit.

Perhaps the most serious consequence of improper reliance on AI is the risk of Rule 11 sanctions. Rule 11 of the North Carolina Rules of Civil Procedure requires that every pleading, motion, or other paper submitted to the court be signed, and the signature certifies that the filing:

• Is not presented for an improper purpose;

• Is warranted by existing law or a good-faith argument for extending it;

• Has evidentiary support (or is likely to have support after discovery); and

• Has been reasonably investigated under the circumstances.

If a pro se litigant files a motion or complaint based on inaccurate AI-generated legal authority - such as fake case citations or nonexistent statutes - the court may find that the filing violates Rule 11.

Notably, recent national headlines have showcased attorneys being sanctioned for citing "hallucinated" cases produced by AI tools like ChatGPT. Pro se litigants are not exempt from Rule 11 simply because they lack legal training. Filing frivolous or unfounded legal documents, even unintentionally, can result in sanctions ranging from fines to dismissal of claims, or even a bar on future filings.

Moreover, unlike attorneys who have malpractice insurance and professional reputations at stake, pro se litigants may suffer irreparable harm from Rule 11 sanctions - financially, legally, and reputationally.

The rise of AI in legal practice offers unprecedented opportunities to democratize access to legal information and empower pro se litigants. Yet, this promise comes with serious risks, especially for those unaware of the procedural requirements and legal responsibilities that govern civil litigation. Courts are not forgiving when it comes to Rule 11 violations, even for self-represented parties. Without careful vetting, proper legal understanding, and the humility to recognize AI’s limitations, pro se litigants risk undermining their own cases and facing significant sanctions. As AI tools become more common in courtrooms and clerk’s offices, a parallel push for legal education, transparency in AI outputs, and professional oversight is essential. In the meantime, pro se litigants would be wise to consult a licensed attorney to review AI-generated documents before filing and to familiarize themselves with the rules that govern their actions in court.
