In this article from the Spring 2025 edition of Expert Matters, Sean Mosby, Policy Manager at the Expert Witness Institute, examines a recent case in which the use of generative AI in expert evidence was critically assessed by the court. The judgment highlights the risks of relying on AI tools like Microsoft Copilot without understanding their limitations and sets out key learning points for both experts and instructing parties.

The case

The petitioner was seeking an order for settlement of the accounts of the estate she had been administering. The objectant, a beneficiary of the trust, claimed the petitioner had breached her fiduciary duty by deciding to retain a property in the Bahamas when selling it would have been the better financial option. He also objected to her travel to the island, which he claimed amounted to a conflict of interest or self-dealing.

The expert evidence

The objectant relied on the expert evidence of Charles Ransom to prove the damages caused by the retention of the property in the Bahamas. Mr Ransom provided two reports. The court identified numerous deficiencies in Mr Ransom’s evidence, concluding that his “calculations, and specifically those with regards to damages, are inherently unreliable, are based on speculation, hypothetical market performance, and are unsupported or outright contradicted by facts in the record.”

The expert’s use of AI

The court went on to examine Mr Ransom’s use of Microsoft Copilot, a generative AI tool, in cross-checking his calculations. Mr Ransom could not recall the input or prompt he used, could not state which sources Copilot relied upon, and was unable to explain how Copilot works and how it arrives at a given output.

The judge noted that the court did not have an objective understanding of how Copilot works and that, when it attempted to replicate Mr Ransom’s results, Copilot provided different outputs. The judge observed that “while these resulting variations are not large, the fact there are variations at all calls into question the reliability and accuracy of Copilot to generate evidence to be relied upon in a court proceeding.”

The judge then queried Copilot about its reliability and accuracy. The tool responded that “my accuracy is only as good as my sources so for critical matters, it’s always wise to verify” and “I do my best to be as reliable as possible. However, I’m also programmed to advise checking with experts for critical issues.” When asked whether its calculations were reliable enough to be used in court, Copilot replied, “when it comes to legal matters, any calculations or data need to meet strict standards. I can provide accurate info, but it should always be verified by experts and accompanied by professional evaluations before being used in court.”

The judge concluded that “it would seem that even Copilot itself self-checks and relies on human oversight and analysis. It is clear from these responses that the developers of the Copilot program recognise the need for its supervision by a trained human operator to verify the accuracy of the submitted information as well as the output.”

Although Mr Ransom argued that the use of AI tools for drafting expert reports was generally accepted in the field of fiduciary services, he was unable to name any publications or other sources confirming that such use was generally accepted.

The court found that, given the rapid evolution of artificial intelligence and its inherent reliability issues, counsel has an affirmative duty to disclose the use of artificial intelligence before introducing evidence generated by an AI product or system. Permission to use such evidence should properly be the subject of a pre-trial hearing, with the scope of the evidence determined by the court.

Learning points

Learning points for experts:

  • Do not use an AI tool unless you fully understand it and can explain how the tool works, how you have used it, how it generated the results, and what the results mean.
  • If you wish to use an AI tool, tell your instructing party as soon as possible which tool you intend to use, explaining why you require it and how you intend to use it.
  • Disclose your use of the AI tool to all parties and the court, clearly setting out the prompts and configurations you have used.
  • Make sure you use the AI tool thoughtfully and be extremely careful before entering any confidential or personal information. Remember that AI tools will generally retain any information you enter and the organisation managing the tool may use that information to provide services to third parties.

Learning points for instructing parties:

  • Ask the experts you instruct whether they intend to use any AI tools.
  • If they intend to use an AI tool, make them aware of any guidance or case law on the use of AI tools in expert evidence.
  • Ensure that any use of AI tools is disclosed to the court and the other party.
  • Ideally, you should include the request to use AI in your application for permission to rely on expert evidence, explaining why the tool is needed, how it will be used, why it is reliable, and how any confidential or personal information will be protected.  

Lastly, Sir Keith Lindblom’s keynote address to the 2024 EWI Conference is available in the EWI web shop if you want to hear some thoughts from the judiciary on the role of AI in expert evidence.
