Legal Updates · 19 April 2026 · 7 min read

AI in the Courtroom: What the Recent Cases Mean for Civil Litigants

Courts in England and Wales are now actively sanctioning lawyers and litigants who rely on AI-generated legal research without verification. Here is what you need to know.

Artificial intelligence tools are now widely accessible and increasingly used to draft correspondence, research legal issues, and prepare court documents. In recent months, a series of decisions in the courts of England and Wales has made clear that relying on AI-generated legal research without independent verification is not merely careless. It can result in wasted costs orders, regulatory referrals, and in the most serious cases, criminal liability. For anyone involved in civil litigation, whether as a party, a solicitor, or a litigant in person, these developments demand attention.

The leading case: Ayinde v Haringey

The decision that brought this issue firmly into English law is R (on the application of Ayinde) v Haringey LBC [2025] EWHC 1383 (Admin). In that case, counsel included fabricated case citations in written submissions to the High Court. The authorities cited did not exist. They had been generated by an AI tool and had not been checked against any legal database or primary source before being put before the court.

The court's response was unequivocal. A wasted costs order was made against both the claimant's solicitors and counsel, each being ordered to pay £2,000 to the defendant. More significantly, the court referred the lawyers involved to their respective regulatory bodies. The judgment made clear that a legal professional's duty to the court includes verifying every authority cited in submissions, regardless of how that authority was originally sourced.

The pattern continues: Al-Haroun and Elden

Ayinde was not an isolated incident. In the Al-Haroun case, a solicitor submitted a witness statement containing eighteen authorities that were found not to exist. The solicitor had relied on research undertaken by his client and had not independently verified any of it. The fact that the solicitor had not personally used AI was no defence. The obligation to check the accuracy of legal authorities before presenting them to the court falls on the legal professional who signs the document or puts it before the judge.

More recently, in Elden v Revenue and Customs Commissioners [2026] UKFTT 41 (TC), the First-tier Tribunal addressed inaccuracies in case summaries within a skeleton argument. The summaries had been produced using AI and contained errors of fact and citation. The Tribunal responded by imposing specific requirements for future submissions: the party was directed to annex full copies of the judgments relied upon, to use direct quotations rather than paraphrased summaries, and to include a statement of truth confirming that all facts and case references had been independently verified.

Litigants in person are not exempt

The Tribunal in Zzaman v HMRC [2025] UKFTT 00539 (TC) dealt with a litigant in person who had used AI to prepare a written statement of case. The document contained inaccurate citations and unsupported propositions. While the Tribunal was sympathetic to the position of an unrepresented party, it did not excuse the reliance on unverified AI output. The Tribunal offered practical suggestions for litigants using AI tools: instruct the tool to decline to answer where it is uncertain, request specific paragraph references for any cited case, and ask the tool to identify weaknesses in the arguments it generates.

This is an important point for the many individuals and small businesses who pursue or defend civil claims without legal representation. AI tools can be helpful for organising thoughts or drafting correspondence, but using them to generate legal authorities carries real risk. A fabricated case citation in a skeleton argument or witness statement can undermine credibility entirely and may result in adverse costs consequences.

The CJC consultation: what comes next

In February 2026, the Civil Justice Council published a consultation paper considering whether formal rules should govern the use of AI in court documents. The consultation, which closed on 14 April 2026, proposed a differentiated approach depending on the type of document involved.

For statements of case and skeleton arguments, the CJC's provisional view was that no new rules are needed, provided these documents continue to bear the name of the responsible lawyer. The logic is that existing professional duties already require a lawyer to verify the contents of any document they sign or present to the court. For witness statements, however, the CJC proposed that a declaration should be required confirming that AI was not used to generate the factual content of the statement. For expert reports, the proposed approach would require the statement of truth to identify and explain any use of AI beyond purely administrative tasks such as formatting or spell-checking.

Whatever the outcome of the consultation, the direction of travel is clear. The courts expect transparency about AI use and will hold practitioners accountable for the accuracy of AI-generated material.

What this means for civil disputes

The practical implications extend well beyond courtroom advocacy. In pre-action correspondence, letters before claim, and responses to statutory demands, the accuracy of legal citations matters. A letter of claim that relies on a fabricated authority is not only ineffective but may expose the sender to costs sanctions if the matter proceeds to litigation and the opposing party or the court identifies the error.

For solicitors and legal advisers, the message from these cases is straightforward: every authority must be verified against a primary source. AI tools can assist with drafting and research, but they cannot replace the professional judgment required to confirm that a case exists, that it says what it is claimed to say, and that it is relevant to the argument being advanced. For litigants in person, the safest approach is to treat any AI-generated legal reference as a starting point that must be checked, not a finished product ready for submission.

If you are involved in a civil dispute and are unsure about the strength of your legal position, obtaining proper legal advice remains the most reliable way to avoid procedural difficulties and to ensure that any correspondence or court documents are accurate and properly supported.

Disclaimer: This article is for general informational purposes only and does not constitute legal advice. The content should not be relied upon as a substitute for specific legal advice relevant to your situation. If you require legal assistance, please contact us for a confidential discussion.

Important regulatory information: Fullbrook Law Limited is not authorised or regulated by the Solicitors Regulation Authority. Zachary Taylor is individually regulated by the SRA as a solicitor advocate (SRA no. 7013058). Fullbrook Law does not conduct litigation or represent clients in court proceedings. For full details of your regulatory protections, please see our regulatory information and complaints procedure.