Key Takeaways

Attorneys should be wary of allowing experts to use AI technology to draft any portion of expert testimony. An expert who relies on AI technology should carefully review the output for AI hallucinations and ensure that their written opinions and citations are their own—as should counsel in the case.

An AI expert’s declaration that relied on hallucinated citations to support substantive claims is being challenged under Daubert and Rule 702.

In a lawsuit challenging a Minnesota law that criminalizes the use of deepfakes (AI-generated videos or other media), the defendant’s AI expert submitted a declaration containing AI “hallucinations”: artificially generated content (here, citations) that appears legitimate but does not actually exist. On this basis, the plaintiffs moved to exclude the expert’s testimony under Daubert and Rule 702. Mot. Exclude Expert Test., Kohls v. Ellison, No. 0:24-cv-3754-LMP-DLM (D. Minn. Nov. 15, 2024), ECF No. 29.

Minnesota’s deepfake law was passed in 2023 with the intent of safeguarding elections. A popular social media comedian, along with a Minnesota State Representative, sued to block enforcement of the law on the ground that it unconstitutionally restricts free speech. Compl., (D. Minn. Sept. 27, 2024), ECF No. 1. The expert at issue—a Stanford professor and leading scholar in the area—submitted a declaration in support of the State’s opposition to the plaintiffs’ motion for a preliminary injunction. Aff. Prof. Jeff Hancock, (D. Minn. Nov. 1, 2024), ECF No. 23.

The expert has since acknowledged and sought leave to correct three citation errors in his declaration, including two hallucinated citations to nonexistent studies. Decl. Prof. Jeff Hancock Supp. Mot., (D. Minn. Nov. 27, 2024), ECF No. 39. He explained that the citations were generated by GPT-4o, an AI tool, after he prompted it to draft paragraphs based on bullet points that he wrote, leaving the term “‘[cite]’ as a placeholder to remind [himself] to go back and add the academic citation” for each bullet point. Id. GPT-4o then replaced “[cite]” with hallucinated sources, which went unnoticed in the expert’s review of the drafted paragraphs. Id. Despite these errors, he said he “stand[s] firmly behind all of the substantive points in the declaration” and asserted that “correcting these errors does not in any way alter [his] original conclusions.” Id.

The plaintiffs have moved to exclude the expert’s testimony, highlighting the irony that an expert on AI misinformation cited studies that were not real. Mem. Supp. Re Mot. Exclude Expert Test., (D. Minn. Nov. 16, 2024), ECF No. 30. They also contend that the fabricated studies were cited in support of the expert’s substantive claims about the dangers of deepfakes, undermining the reliability of his entire declaration. Id. The State opposed the plaintiffs’ motion to exclude and moved for leave to submit the expert’s amended declaration. Mot. Leave File Am. Expert Dec., (D. Minn. Nov. 27, 2024), ECF No. 34. The court has not yet resolved either motion.


Contributors

*The Re:Torts team would like to thank Brooke Meadowcroft for her contribution to this article.