CRYPTOREPORTCLUB
  • Crypto news
  • AI
  • Technologies
Thursday, October 30, 2025
‘Hallucinated’ cases are affecting lawyers’ careers. They need to be trained to use AI

October 29, 2025

Lisa Lock, scientific editor
Andrew Zinin, lead editor

Editors' notes: This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, trusted source, written by researcher(s), proofread.

[Image: AI in the courtroom. Credit: AI-generated image]

Generative artificial intelligence, which produces original content by drawing on large existing datasets, has been hailed as a revolutionary tool for lawyers. From drafting contracts to summarizing case law, generative AI tools such as ChatGPT and Lexis+ AI promise speed and efficiency.

But the English courts are now seeing a darker side of generative AI. This includes fabricated cases, invented quotations, and misleading citations entering court documents.

As someone who studies how technology and the law interact, I argue it is vital that lawyers are taught how, and how not, to use generative AI. Lawyers need to avoid not only the risk of sanctions for breaking the rules, but also contributing to a legal system that risks deciding questions of justice on the basis of fabricated case law.

On 6 June 2025, the high court handed down a landmark judgment on two separate cases: Frederick Ayinde v The London Borough of Haringey and Hamad Al-Haroun v Qatar National Bank QPSC and QNB Capital LLC.

The court reprimanded a pupil barrister (a trainee) and a solicitor after their submissions contained fictitious and inaccurate case law. The judges were clear: "freely available generative artificial intelligence tools… are not capable of conducting reliable legal research."

As such, the use of unverified AI output can no longer be excused as error or oversight. Lawyers, junior or senior, are fully responsible for what they put before the court.

Hallucinated case law

AI "hallucinations"—the confident generation of non-existent or misattributed information—are well documented. Legal cases are no exception. Research has recently found that hallucination rates range from 58% to 88% in response to specific legal queries, often on precisely the sorts of issues lawyers are asked to resolve.

These errors have now leapt off the screen and into real legal proceedings. In Ayinde, the trainee barrister cited a case that did not exist at all. The fabricated citation had been attached to a genuine case number taken from a completely different matter.

In Al-Haroun, a solicitor listed 45 cases provided by his client. Of these, 18 were fictitious and many others irrelevant. The judicial assistant is quoted in the judgment as saying: "The vast majority of the authorities are made up or misunderstood."

These incidents highlight a profession facing a perfect storm: overstretched practitioners, increasingly powerful but unreliable AI tools, and courts no longer willing to treat errors as mishaps. For the junior legal profession, the consequences are stark.

Many are experimenting with AI out of necessity or curiosity. Without the training to spot hallucinations, though, new lawyers risk reputational damage before their careers have fully begun.

The high court took a disciplinary approach, placing responsibility squarely on the individual and their supervisors. This raises a pressing question. Are junior lawyers being punished too harshly for what is, at least in part, a training and supervision gap?

Education as prevention

Law schools have long taught research methods, ethics, and citation practice. What is new is the need to frame those same skills around generative AI.

Many law schools and universities are either exploring AI within existing modules or creating new modules dedicated to it, and there is a broader shift towards considering how AI is changing the legal sector as a whole.

Students must learn why AI produces hallucinations, how to design prompts responsibly, how to verify outputs against authoritative databases and when using such tools may be inappropriate.
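The verification discipline described above can be made concrete. The following is a minimal, hypothetical sketch (the function, the case names, and the authoritative list are all invented for illustration, not a real legal database API): every AI-suggested citation is checked against a trusted source before it goes into a filing, and anything that cannot be verified is flagged for human review rather than trusted.

```python
# Hypothetical sketch: never rely on an AI-suggested citation without
# first checking it against an authoritative source.

def verify_citations(ai_citations, authoritative_cases):
    """Split AI-suggested citations into verified and unverified lists."""
    verified, unverified = [], []
    for citation in ai_citations:
        if citation in authoritative_cases:
            verified.append(citation)
        else:
            # An unverified citation may be a hallucination: it must be
            # checked by a human against a legal database, never filed as-is.
            unverified.append(citation)
    return verified, unverified

# Invented example data, not real case law.
authoritative = {"Smith v Jones [2010] EWCA Civ 1"}
suggested = [
    "Smith v Jones [2010] EWCA Civ 1",
    "Doe v Roe [2021] UKSC 99",
]

ok, flagged = verify_citations(suggested, authoritative)
print(flagged)  # the unverified citation, which needs human review
```

In practice the authoritative source would be a recognized legal database rather than an in-memory set, but the workflow is the point: verification is a deliberate step, not an afterthought.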

The high court's insistence on responsibility is justified. The integrity of justice depends on accurate citations and honest advocacy. But the solution cannot rest on sanction alone.

If AI is part of legal practice, then AI training and literacy must be part of legal training. Regulators, professional bodies and universities share a collective duty to ensure that junior lawyers are not left to learn through error or in the most unforgiving of environments, the courtroom.

Similar issues have arisen with people who are not legal professionals. In a Manchester civil case, a litigant in person admitted relying on ChatGPT to generate legal authorities in support of their argument. The individual returned to court with four citations: one entirely fabricated, and three with genuine case names but fictitious quotations attributed to them.

While the submissions appeared legitimate, closer inspection by opposing counsel revealed the paragraphs did not exist. The judge accepted the litigant had been inadvertently misled by the AI tool and imposed no penalty. This shows both the risks of unverified AI-generated content entering proceedings and the challenges for unrepresented parties in navigating court processes.

The message from Ayinde and Al-Haroun is simple but profound: using GenAI does not reduce a lawyer's professional duty, it heightens it. For junior lawyers, that duty will arrive on day one. The challenge for legal educators is to prepare students for this reality, embedding AI verification, transparency, and ethical reasoning into the curriculum.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: 'Hallucinated' cases are affecting lawyers' careers. They need to be trained to use AI (2025, October 29), retrieved 29 October 2025 from https://techxplore.com/news/2025-10-hallucinated-cases-affecting-lawyers-careers.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.


Advertising: digestmediaholding@gmail.com

Disclaimer: Information found on cryptoreportclub.com is those of writers quoted. It does not represent the opinions of cryptoreportclub.com on whether to sell, buy or hold any investments. You are advised to conduct your own research before making any investment decisions. Use provided information at your own risk.
cryptoreportclub.com covers fintech, blockchain and Bitcoin bringing you the latest crypto news and analyses on the future of money.

© 2023-2025 Cryptoreportclub. All Rights Reserved
