Another cautionary tale on the perils of uncritical reliance on generative artificial intelligence (AI) arrived in the English High Court in April 2025. The court found a barrister and a firm of solicitors responsible for including fictitious case citations in formal submissions before the court. The court’s response was uncompromising: counsel and her instructing solicitors were found to have acted improperly and unreasonably, were held jointly liable for wasted costs of £4,000, had their fees substantially reduced, and were reported to their regulators for what the court labelled “appalling professional misbehaviour”.
The barrister had cited five wholly non-existent cases in written submissions, each citation accompanied by a summary of how the alleged judgment purportedly supported the claimant’s position. The issue came to light when the opposing legal team tried, unsuccessfully, to verify the authorities. The court was scathing about the subsequent responses from the claimant’s legal team, which initially downplayed the fictitious citations as mere “minor citation errors” or “cosmetic errors”, a position the court described as “grossly unprofessional”.
When questioned, the barrister failed to offer a plausible explanation, suggesting instead that the fake cases had somehow been mistakenly pulled from her personal archive. The court firmly rejected this account, noting that it was impossible to hold physical copies of, or references to, judgments that never existed. Although the absence of sworn testimony meant the court could not definitively establish that AI had been used, it strongly suspected that the cases had been irresponsibly sourced from an AI tool and deployed without verification.
The court found that inserting imaginary precedents and then stonewalling when challenged was ample proof of misconduct. The behaviour misled the court, wasted the opponent’s time and money, and undermined public confidence in the profession. In language likely to reappear in disciplinary proceedings, the court said: “This sort of behaviour should not be left unexposed. It undermines the integrity of the legal profession and the Bar.”
The court found that this conduct amounted to improper, unreasonable, and negligent behaviour, warranting a wasted costs order against the barrister and her instructing solicitors. The judgment emphasised the obligation of lawyers to uphold standards of honesty and diligence, clearly indicating that reliance on AI-generated content without proper oversight and validation is entirely unacceptable and professionally negligent.
The judgment must be understood in the context of the guidelines already issued to the English legal profession. The Solicitors Regulation Authority’s 2023 warning notice on the use of AI, the Law Society’s August 2023 practice note and the Bar Council’s October 2023 guidance all press the same point: practitioners must verify any AI output before deploying it in court. This is the first reported English decision to convert that guidance into a costs penalty.
The English courts join global counterparts in signalling zero tolerance for hallucinated citations. The Southern District of New York imposed sanctions on lawyers in Mata v Avianca, Inc. after they relied on ChatGPT‑fabricated authorities. At home, in Mavundla v MEC: Co‑operative Government and Traditional Affairs, the Pietermaritzburg High Court struck out seven hallucinated authorities, ordered the attorneys to bear costs personally and referred them to the Legal Practice Council for possible misconduct.
This is unlikely to be the last case in which a busy practitioner, lulled by an AI tool’s fluent prose, overlooks its propensity to invent. But it is now clear that no practitioner can claim ignorance of the danger. Generative AI may accelerate legal drafting, but it does not dilute professional obligations. As the various judgments have made plain, the price for forgetting that lesson is paid not by the algorithm but by the lawyer – in costs, in reputation and, potentially, in disciplinary sanction.
Ayinde v London Borough of Haringey [2025] EWHC 1040 (Admin)