R (On the Application of Ayinde) v Haringey [2025]
Decision Number: EWHC 1383 Legal Body: High Court of England & Wales
Published on: 03/07/2025
Article Authors The main content of this article was provided by the following authors.
Jason Elliott BL Barrister & Lecturer of Law, Ulster University

Jason Elliott was called to the Bar of Northern Ireland in 2013 and is the Associate Head of the School of Law at Ulster University. As a practising barrister, he has developed a largely civil practice representing individuals, companies and public bodies in litigation. This covers a wide range of areas including personal injuries, wills and employment law. In terms of employment law, he has represented both applicants and respondents in the Industrial Tribunal. At Ulster University, Jason lectures extensively on civil areas of practice such as Equity and Trusts and delivers employment law lectures for both undergraduate and postgraduate students.

Claimants:
R (On the Application of Ayinde) and Hamad Al-Haroun
Defendants:
Haringey London Borough Council, Qatar National Bank QPSC and QNB Capital LLC
Summary

A wasted costs order and a referral to the regulator followed a failure to check the responses of a generative AI tool, which led to fictitious cases being put before the court.

Background

The decision of the High Court related to the use, or suspected use, of generative AI to produce documents placed before the court. In the case, the Law Centre instructed a junior barrister. The pleadings placed before the court cited five cases which did not exist. The Housing Authority applied, stating that it was unable to find the cases and that it intended to apply for wasted costs. The junior barrister responded, with the approval of the solicitor from the Law Centre, that these were 'cosmetic' errors which could be corrected. It was for the court to determine whether wasted costs should be ordered against the claimant's legal representatives.

Outcome

The High Court outlined that freely available generative AI tools, such as ChatGPT, are not capable of conducting reliable legal research. Those who use such tools have a professional responsibility to check their accuracy by cross-referencing the sources they cite. The risks could not be materially reduced by asking the tool to provide an answer only if it was sure.

The High Court noted that practical and effective measures had to be taken by those within the legal profession with leadership responsibilities, such as heads of chambers and managing partners, to ensure compliance with professional and ethical responsibilities. Where a legal representative failed to comply with their duties to the court, the court had powers to publicly admonish the lawyer, make a costs order, make a wasted costs order, strike out a case, refer the lawyer to a regulator, initiate contempt proceedings, or refer the matter to the police.

In making its decision, the court's primary concern was to ensure that lawyers clearly understood the consequences of using AI for legal research without checking the results. The lawyer involved was publicly criticised in the judgment and referred to the regulator. The court did acknowledge some mitigating difficulties, such as the volume of work the barrister had been given and other home and work issues. Wasted costs of £2,000 were also levied against the junior counsel and the Law Centre, payable to the defendant.

Practical Guidance

Generative AI tools offer real efficiencies, producing large amounts of material within seconds. However, caution must be taken in the use of such material, especially given the professional and ethical concerns that can arise in interactions with the court. This judgment gives a stark lesson not only in monetary terms, such as wasted costs, but, perhaps more importantly, in the reputational damage that can arise when the output of AI tools is relied upon blindly rather than thoroughly checked. Both HR professionals and legal professionals must heed this warning and ensure that, where AI tools are being used, there is sufficient oversight of the responses to catch hallucinations and avoid misleading others or the court.

You can read the case in full here.


Disclaimer The information in this article is provided as part of Legal Island's Employment Law Hub. We regret we are not able to respond to requests for specific legal or HR queries and recommend that professional advice is obtained before relying on information supplied anywhere within this article. This article is correct at 03/07/2025