OUT-LAW NEWS
Consultation opens over AI use in English litigation
05 Mar 2026, 11:03 am
A new consultation on how artificial intelligence should be used in preparing civil court documents should help produce sensible guardrails for future procedures, though precision will be needed, according to an expert.
The Civil Justice Council (CJC) has published its interim report on the use of AI for preparing court documents, and is inviting input on new proposals for governing the use of AI tools in this context.
The report from the council - which advises the Lord Chancellor, the judiciary and the Civil Procedure Rules Committee on civil procedure matters - comes as countries around the world introduce their own guidance and restrictions on best AI practice.
It looks at the use of AI in preparing a variety of court documents including statements of case, skeleton arguments and associated documents such as chronologies or case summaries, witness statements and expert reports. It also touches on the role of AI in the context of disclosure.
The most significant proposals made by the CJC include that witness statements prepared for trial in the Business and Property Courts, which hear most large commercial disputes in England and Wales, should be required to include a specific declaration that AI has not been used to generate the statement.
This follows from Practice Direction 57AC of the Civil Procedure Rules, which makes clear that a witness statement must be in a witness’s own words and that, even where a legal representative takes primary responsibility for the drafting of such a statement, its content should be “taken from, and should not go beyond, the content of the record or notes” of the witness’s own version of events.
The CJC’s report expresses the view that it is “difficult to see” how these requirements can be met if AI is used, “other than for non-text generating purposes”, with the proposals meaning legal representatives would be required to declare that AI has not been used for generating the content of such a statement, “including by way of altering, embellishing, strengthening, diluting or rephrasing” it.
In addition, the CJC proposes that the declaration which expert witnesses are required to include in their reports to the court should require experts to explain what use has been made of AI, other than for transcription or other administrative uses, and to identify the AI tools used.
In contrast, the CJC does not recommend additional rules or mandatory declarations for AI use in relation to statements of case or skeleton arguments, if they bear the name of the legal representative who is taking professional responsibility for their contents.
Emilie Jones, a litigation expert at Pinsent Masons, said the consultation was a timely one given how AI was already starting to transform legal practice, including in dispute resolution.
“It is welcome that the CJC recognises this evolution and the opportunities it presents and is not seeking to inhibit growth by overly strict requirements,” she said.
“Lawyers are already subject to professional conduct rules which require them to act appropriately in the use of AI and the preparation of documents which they submit to court. It is sensible, however, to suggest some modest measures in connection with higher risk uses of AI, to ensure transparency, uphold high standards of conduct, and thereby protect the effectiveness, quality and reputation of administration of justice in the English courts.”
She said precision would be essential in choosing how to codify any finalised rule changes, with flexibility needed to reflect the diversity and rapid development of AI use.
“The report recognises that there is some debate around what is meant by ‘AI’, that there are a wide range of different ways in which AI is used in the context of dispute resolution – from ‘administrative’ uses such as for transcription and spell-checking; through research, summarisation and rephrasing; to substantive content generation – and that tools and their use are evolving all the time,” she added.
“The CJC rightly makes clear that its concerns are around more substantive, rather than administrative, uses of AI. It is important that those required to make declarations to the court - which could attract sanctions if inaccurate - can understand precisely what is required of them and are not compelled to make declarations in terms which are too wide, generic, or inflexible, but instead reflect the nuances between different uses of AI.”
The CJC’s report also touches upon the use of AI in connection with disclosure, but does not recommend any rule change in this regard.
Caroline Hearn, a litigation and e-disclosure expert with Pinsent Masons, said that the use of advanced technology in e-disclosure was nothing new, and that parties were already expected to take a careful and transparent approach to disclosure, which would align with the CJC’s direction of travel.
“That said, as the report acknowledges, the use of AI in this area is developing quickly,” she added.
“If used properly, AI has the potential to significantly reduce the time and cost involved in large disclosure exercises, which is clearly attractive for clients. There are, however, areas where further guidance would help parties feel more confident about using these tools.
“One example flagged in the report is whether detailed prompts entered into AI systems for disclosure purposes are themselves disclosable - an issue on which there is currently some debate within the market.
“Clarification, whether through authoritative guidance or a judicial decision, that such prompts will generally be protected by legal professional privilege would be particularly helpful.”
The consultation is limited to civil justice, and the report recognises that the use of AI will continue to develop. It does not address a number of related issues, including the use of AI by litigants in person, who may rely on AI tools for research and document preparation. These issues, and any potential need for restrictions, may require further consideration in future.
The consultation closes on 14 April 2026.