AI in the courtroom: The dangers of using ChatGPT in legal practice in South Africa

November 4, 2025

Editors' notes

Stephanie Baum, scientific editor; Andrew Zinin, lead editor

This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, trusted source, written by researcher(s), proofread.

Credit: Unsplash/CC0 Public Domain

A South African court case made headlines for all the wrong reasons in January 2025. The legal team in Mavundla vs. MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others had relied on case law that simply didn't exist. It had been generated by ChatGPT, a generative artificial intelligence (AI) chatbot developed by OpenAI.

Only two of the nine case authorities the legal team submitted to the High Court were genuine. The rest were AI-fabricated "hallucinations." The court called this conduct "irresponsible and unprofessional" and referred the matter to the Legal Practice Council, the statutory body that regulates legal practitioners in South Africa, for investigation.

It was not the first time South African courts had encountered such an incident. Parker vs. Forsyth in 2023 also dealt with fake case law produced by ChatGPT. But the judge was more forgiving in that instance, finding no intent to mislead. The Mavundla ruling marks a turning point: courts are losing patience with legal practitioners who use AI irresponsibly.

We are legal academics who have been researching the growing use of AI, particularly generative AI, in legal research and education. While these technologies offer powerful tools for improving efficiency and productivity, they also present serious risks when used irresponsibly.

Aspiring legal practitioners who misuse AI tools without proper guidance or ethical grounding risk severe professional consequences, even before their careers begin. Law schools should equip students with the skills and judgment to use AI tools responsibly. But most institutions remain unprepared for the pace at which AI is being adopted.

Very few universities have formal policies or training on AI. Students are left with no guide through this rapidly evolving terrain. Our work calls for a proactive and structured approach to AI education in law schools.

When technology becomes a liability

The advocate in the Mavundla case admitted she had not verified the citations and relied instead on research done by a junior colleague. That colleague, a candidate attorney, claimed to have obtained the material from an online research tool. While she denied using ChatGPT, the pattern matched similar global incidents where lawyers unknowingly filed AI-generated judgments.

In the 2024 American case of Park vs. Kim, the attorney cited non-existent case law in her reply brief, which she admitted was generated using ChatGPT. In the 2024 Canadian case of Zhang vs. Chen, the lawyer filed a notice of application containing two non-existent case authorities fabricated by ChatGPT.

The court in Mavundla was unequivocal: No matter how advanced technology becomes, lawyers remain responsible for ensuring that every source they present is accurate. Workload pressure or ignorance of AI's risks is no defense.

The judge also criticized the supervising attorney for failing to check the documents before filing them. The episode underscored a broader ethical principle: Senior lawyers must properly train and supervise junior colleagues.

The lesson here extends far beyond one law firm. Integrity, accuracy and critical thinking are not optional extras in the legal profession. They are core values that must be taught and practiced from the beginning, during legal education.

The classroom is the first courtroom

The Mavundla case should serve as a warning to universities. If experienced legal practitioners can fall into AI's traps, students who are still learning to research and reason are even more vulnerable.

Generative AI tools like ChatGPT can be powerful allies—they can summarize cases, draft arguments and analyze complex texts in seconds. But they can also confidently fabricate information. Because AI models don't always "know" when they are wrong, they produce text that looks authoritative but may be entirely false.

For students, the dangers are twofold. First, over-reliance on AI can stunt the development of critical research skills. Second, it can lead to serious academic or professional misconduct. A student who submits AI-fabricated content could face disciplinary action at university and reputational damage that follows them into their legal career.

In our paper we argue that instead of banning AI tools outright, law schools should teach students to use them responsibly. This means developing "AI literacy": the ability to question, verify and contextualize AI-generated information. Students should learn to treat AI systems as assistants, not authorities.

In South African legal practice, authority traditionally refers to recognized sources such as legislation, judicial precedent and academic commentary, which lawyers cite to support their arguments. These sources are accessed through established legal databases and law reports, a process that, while time-consuming, ensures accuracy, accountability and adherence to the rule of law.

From law faculties to courtrooms

Legal educators can embed AI literacy into existing courses on research methodology, professional ethics and legal writing. Exercises could include verifying AI-generated summaries against real judgments or analyzing the ethical implications of relying on machine-produced arguments.
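
To make that verification exercise concrete, here is a minimal sketch in Python of how a student might cross-check AI-supplied citations against sources they have already confirmed. The citations, the verified_citations set and the check_ai_citations function are hypothetical placeholders, not any real legal database or API; in practice students would verify against official law reports or a recognized research platform.

```python
# Minimal sketch of a citation-verification exercise (illustrative only).
# Assumption: "verified_citations" stands in for an authoritative source such as
# an official law report series or database; the citations below are invented
# placeholders, not real cases, and no real legal-research API is used.

def normalise(citation: str) -> str:
    """Lower-case and collapse whitespace so trivial formatting differences don't matter."""
    return " ".join(citation.lower().split())

# Hypothetical set of citations the student has already confirmed against primary sources.
verified_citations = {
    normalise("Example v Example 2020 (1) SA 1 (CC)"),
}

def check_ai_citations(ai_citations: list[str]) -> dict[str, bool]:
    """Map each AI-supplied citation to whether it matches a verified source."""
    return {c: normalise(c) in verified_citations for c in ai_citations}

if __name__ == "__main__":
    suspect = [
        "Example v Example 2020 (1) SA 1 (CC)",         # verifiable placeholder
        "Fictitious v Nonexistent 2021 (3) SA 99 (GP)",  # plausible-looking but unverifiable
    ]
    for citation, ok in check_ai_citations(suspect).items():
        print(("VERIFIED:   " if ok else "UNVERIFIED: ") + citation)
```

The value of the exercise lies in the habit it builds: anything a chatbot produces is treated as unverified until it has been matched against a primary source.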

Teaching responsible AI use is not simply about avoiding embarrassment in court. It is about protecting the integrity of the justice system itself. As seen in Mavundla, one candidate attorney's uncritical use of AI led to professional investigation, public scrutiny and reputational damage to the firm.

The financial risks are also real. Courts can order lawyers to pay costs out of their own pockets when serious professional misconduct occurs. In the digital era, where court judgments and media reports spread instantly online, a lawyer's reputation can collapse overnight if they are found to have relied on fake or unverified AI material. Courts, too, would benefit from training in detecting AI-generated fake cases.

The way forward

Our study concludes that AI is here to stay, and so is its use in law. The challenge is not whether the legal profession should use AI, but how. Law schools have a critical opportunity, and an ethical duty, to prepare future practitioners for a world where technology and human judgment must work side by side.

Speed and convenience can never replace accuracy and integrity. As AI becomes a routine part of legal research, tomorrow's lawyers must be trained not just to prompt—but to think.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: AI in the courtroom: The dangers of using ChatGPT in legal practice in South Africa (2025, November 4), retrieved 4 November 2025 from https://techxplore.com/news/2025-11-ai-courtroom-dangers-chatgpt-legal.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

Explore further

'Hallucinated' cases are affecting lawyers' careers. They need to be trained to use AI
