Authenticity of disclosed documents and AI
Tuesday 14 April 2026
David Hopkins
39 Essex Chambers, London
david.hopkins@39essex.com
Santosh Carvalho
39 Essex Chambers, London
santosh.carvalho@39essex.com
Generative Artificial Intelligence ('GenAI') poses potentially significant challenges to the processes of disclosure of documents and other evidence in common law adversarial litigation.
Forgeries and other fraudulent documents are hardly new. But with the now widespread availability of high-quality large language model, text-to-audio, text-to-image, and text-to-video tools, the barriers to a malicious actor creating a plausible, but inauthentic, document have never been lower. A party to litigation acting in bad faith might disclose or otherwise share such a document with opposing parties either (a) to rely on it at a hearing; or (b) to sow doubt and confusion, even if it is never eventually relied upon at a hearing.
In the face of these risks, it is incumbent on practitioners to ask whether the existing procedural guardrails against forged documents are fit for purpose. We identify potential scenarios for bad actors to use documents forged with the assistance of GenAI, the existing procedural guardrails in three common law jurisdictions and their limitations, and finally some possible responses to the challenge.
Litigation scenarios for the use, by bad actors, of documents forged with the assistance of GenAI
The significant reductions in the cost and difficulty of forging documents made possible by GenAI mean the scenarios for their use by bad actors potentially cover all types of litigation, from relatively minor claims through to High Court blockbusters:
- A red car and a luxury blue car collide at a roundabout. The driver of the red car ('R') was at fault. No one is injured, but around £10,000’s worth of damage is caused to the blue car. At the scene of the accident, the drivers courteously exchange details, with neither admitting liability. The driver of the blue car ('B') later brings a claim against R, such claims being a staple of the lower courts in several common law jurisdictions. During disclosure, R discloses a purported dashcam video (in fact created using OpenAI’s Sora) showing the blue car swerving violently into the red car. B’s insurer insists they settle the claim, despite B’s protestations the video does not show what really happened.
- In a long-running shareholders’ dispute, a party ('X') discloses purported board minutes from 1998, apparently written using Word 97 (possibly assisted by Clippy). The minutes record a crucial decision. The potential relevance of these minutes is not appreciated by opposing parties until X refers to them extensively in their witness statement. Other parties to the dispute suspect, but struggle to prove, the minutes were generated using a large language model with sophisticated prompting. The minutes use period-appropriate language and reference real people, some of whom are since deceased. The company's records from that period are alleged to have been mostly lost in a 2003 office move. No backup server exists, nor any other rich trail of historical metadata, to substantiate whether the minutes were created or circulated at the time. (See Crypto Open Patent Alliance v Wright [2024] EWHC 1198 (Ch)1 for a recent example of how a determined malicious actor can attempt to abuse the court’s process with fabricated documents that purport to record historical facts.)
The procedural guardrails – a notice to prove the authenticity of a document
Disclosure (also known as discovery) – the right to obtain documentary evidence from one’s opponent, even where it undermines the opponent’s case – is a key feature of litigation in the common law tradition.
- In England and Wales, rule 32.19 of the Civil Procedure Rules 1998 ('CPR') provides:2
'(1) A party shall be deemed to admit the authenticity of a document disclosed to him […] unless he serves notice that he wishes the document to be proved at trial.
(2) A notice to prove a document must be served—(a) by the latest date for serving witness statements; or (b) within 7 days of disclosure of the document, whichever is later.'
- In Australia, rule 22.05 of the Federal Court Rules 2011 provides:3
'A party (the first party) will be taken to have admitted the authenticity of any document specified in another party’s list of documents for which inspection has been permitted unless:
(a) the authenticity has been denied in the first party’s pleadings or affidavits; or
(b) the first party has given the other party notice within 14 days after inspection was permitted that the authenticity of the document is denied.
Subject to the court's power to dispense with compliance with this rule at any time – r1.34.'
- Finally, Order 12 rule 10 of the Singapore International Commercial Court Rules 20214 is materially similar to the Australian rule above, save that, under the second limb, the receiving party has 28 days to serve a notice of non-admission.
These rules provide an opportunity to shift the burden of proving authenticity onto the party relying on the document. Provided a notice is served in good time, that party must adduce credible evidence of the document’s authenticity. In the case, say, of a hard copy diary, such evidence might take the form of witness evidence from the diary’s author and/or expert handwriting evidence. If the party challenging the document fails to adduce sufficient rebuttal evidence then, unless the evidence of authenticity is wholly incredible, the court will find authenticity proven.
Limitations of the existing guardrails
The rules outlined above effectively embody a presumption of authenticity. The bases for this presumption seem to be:
- Generally, parties do not knowingly seek to rely on or disclose fabricated documents.
- Creating a convincing forgery has a high cost, in terms of time, skill, and, if a professional forger is involved, money.
- Knowingly relying on or disclosing a forgery in litigation has a high, albeit contingent, cost. In the event that the court determines the document is a forgery it is highly likely to treat other evidence adduced by the party relying on the forgery with greater scepticism. It may well impose sanctions, such as adverse costs orders.
But, in 2026, everyone with a smartphone has access to the tools to create forgeries which are indistinguishable from genuine documents to the untrained eye. Creation takes minutes or hours. The risk of detection may also have reduced. The 'price' of creating a convincing forgery has been cut significantly. In equilibrium we should therefore expect a significant increase in 'demand' for forgeries.
The presumption also favours more sophisticated (in monetary and/or technological terms) parties. In large-scale litigation, the inauthentic document (or a handful of them) may be strategically buried within tens or hundreds of thousands of documents, putting scrutiny of it beyond the reach of all but the most diligent parties until, perhaps, it is too late.
The way forward
Recognising that the 'benefits [of AI] do not come without significant risk', the Civil Justice Council, an advisory body in the UK justice system, is carrying out a consultation to consider whether new procedural rules are needed to govern the use of AI by legal representatives for the preparation of court documents5. But there do not yet appear to be any consultations or proposals (at least in our jurisdiction) to deal with the risk of the use of GenAI in respect of contemporaneous evidentiary documents.
It seems to us that few common law litigators in jurisdictions with the presumption of authenticity, or their clients, would welcome its removal. Requiring a party to prove the authenticity of each document it discloses and/or relies upon would be wholly disproportionate in nearly all cases: litigation of even modest scale can require the disclosure of thousands of documents. It would also inevitably favour the well-resourced party.
In our jurisdiction, the CPR require parties and their representatives to make certifications to ensure the integrity of the disclosure process. A disclosure statement under the CPR requires a party to certify the extent of the search they have conducted and their understanding of their obligation to disclose documents. But it does not require parties to certify that the documents they are disclosing are untainted by GenAI. If the potential risks we have outlined above start to be realised, the time may come to require parties and their legal representatives to certify, on pain of contempt of court if untrue, whether any of their disclosed documents were created or edited in any way by AI. This would, at least, require parties to put their minds to the consequences of disclosing such documents.
Notes
1 https://caselaw.nationalarchives.gov.uk/ewhc/ch/2024/1198, accessed 13 March 2026.
2 https://www.justice.gov.uk/courts/procedure-rules/civil/rules/part32#32.19, accessed 12 March 2026.
3 https://www.austlii.edu.au/cgi-bin/viewdoc/au/legis/cth/consol_reg/fcr2011186/s22.05.html, accessed 12 March 2026.
4 https://sso.agc.gov.sg/SL/SCJA1969-S924-2021?ProvIds=PO12-#PO12-pr10-, accessed 12 March 2026.
5 https://www.judiciary.uk/wp-content/uploads/2026/02/Interim-Report-and-Consultation-Use-of-AI-for-Preparing-Court-Documents-2.pdf, accessed 13 March 2026.