{"id":299065,"date":"2025-11-18T12:25:13","date_gmt":"2025-11-18T12:25:13","guid":{"rendered":"https:\/\/www.newsbeep.com\/us\/299065\/"},"modified":"2025-11-18T12:25:13","modified_gmt":"2025-11-18T12:25:13","slug":"ai-generated-evidence-showing-up-in-court-alarms-judges","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/us\/299065\/","title":{"rendered":"AI-generated evidence showing up in court alarms judges"},"content":{"rendered":"<p id=\"anchor-0580d3\" class=\"body-graf\">Judge Victoria Kolakowski sensed something was wrong with Exhibit 6C.<\/p>\n<p id=\"anchor-e1ee63\" class=\"body-graf\">Submitted by the plaintiffs in a California housing dispute, <a href=\"https:\/\/drive.google.com\/file\/d\/1h1ae0izs07kGdF3HKALRvla-cgB1E1gF\/view\" target=\"_blank\" rel=\"nofollow noopener\">the video<\/a> showed a witness whose voice was disjointed and monotone, her face fuzzy and lacking emotion. Every few seconds, the witness would twitch and repeat her expressions.<\/p>\n<p id=\"anchor-b02023\" class=\"body-graf\">Kolakowski, who serves on California\u2019s Alameda County Superior Court, soon realized why: The video had been produced using generative artificial intelligence. Though the video claimed to feature a real witness \u2014 who had appeared in another, authentic piece of evidence \u2014 Exhibit 6C was an AI \u201cdeepfake,\u201d Kolakowski said.<\/p>\n<p id=\"anchor-a17edf\" class=\"body-graf\">The case, Mendones v. 
Cushman &amp; Wakefield, Inc., appears to be one of the <a href=\"https:\/\/judgeschlegel.com\/blog\/what-happens-when-ai-deepfakes-fool-a-judge\" target=\"_blank\" rel=\"nofollow noopener\">first instances<\/a> in which a suspected deepfake was submitted as purportedly authentic evidence in court and detected \u2014 a sign, <a href=\"https:\/\/www.law.com\/therecorder\/2025\/09\/25\/alameda-county-judge-says-plaintiff-used-ai-generated-fake-evidence-tosses-case\/?slreturn=20251024124529\" target=\"_blank\" rel=\"nofollow noopener\">judges<\/a> and <a href=\"https:\/\/lexara.com\/articles\/the-trust-apocalypse-and-deepfakes-in-legal-practice\" target=\"_blank\" rel=\"nofollow noopener\">legal experts<\/a> said, of a much larger threat. <\/p>\n<p id=\"anchor-a197e5\" class=\"body-graf\">Citing the plaintiffs\u2019 use of AI-generated material masquerading as real evidence, Kolakowski dismissed the case on Sept. 9. The plaintiffs sought reconsideration of her decision, arguing the judge suspected but failed to prove that the evidence was AI-generated. Judge Kolakowski denied their request for reconsideration on Nov. 6. The plaintiffs did not respond to a request for comment.<\/p>\n<p id=\"anchor-7613eb\" class=\"body-graf\">With the rise of powerful AI tools, AI-generated content is increasingly finding its way into courts, and some judges are worried that hyperrealistic fake evidence will soon flood their courtrooms and threaten their fact-finding mission. <\/p>\n<p id=\"anchor-7ed3a5\" class=\"body-graf\">NBC News spoke to five judges and 10 legal experts who warned that the rapid advances in generative AI \u2014 now capable of producing convincing fake videos, images, documents and audio \u2014 could erode the foundation of trust upon which courtrooms stand. 
Some judges are raising awareness and calling for action around the issue, but the process is just beginning.<\/p>\n<p id=\"anchor-6dea9f\" class=\"body-graf\">\u201cThe judiciary in general is aware that big changes are happening and want to understand AI, but I don\u2019t think anybody has figured out the full implications,\u201d Kolakowski told NBC News. \u201cWe\u2019re still dealing with a technology in its infancy.\u201d<\/p>\n<p id=\"anchor-422d91\" class=\"body-graf\">Even before the Mendones case, courts had <a href=\"https:\/\/law.justia.com\/cases\/pennsylvania\/superior-court\/2024\/2032-eda-2023.html\" target=\"_blank\" rel=\"nofollow noopener\">repeatedly dealt<\/a> with <a href=\"https:\/\/cases.justia.com\/federal\/district-courts\/nevada\/nvdce\/2:2025cv00504\/173684\/107\/0.pdf\" target=\"_blank\" rel=\"nofollow noopener\">a phenomenon<\/a> billed as <a href=\"https:\/\/www.californialawreview.org\/print\/deep-fakes-a-looming-challenge-for-privacy-democracy-and-national-security\" target=\"_blank\" rel=\"nofollow noopener\">the \u201cLiar\u2019s Dividend<\/a>\u201d \u2014 when plaintiffs and defendants invoke the possibility of generative AI involvement to <a href=\"https:\/\/www.brookings.edu\/articles\/watch-out-for-false-claims-of-deepfakes-and-actual-deepfakes-this-election-year\/\" target=\"_blank\" rel=\"nofollow noopener\">cast doubt on actual, authentic evidence<\/a>. But in the Mendones case, the court found the plaintiffs attempted the opposite: to pass off AI-generated video as genuine evidence. <\/p>\n<p id=\"anchor-c7ac5c\" class=\"body-graf\">Judge Stoney Hiljus, who serves in Minnesota\u2019s 10th Judicial District and is chair of the Minnesota Judicial Branch\u2019s AI Response Committee, said the case brings to the fore a growing concern among judges. 
<\/p>\n<p id=\"anchor-b22ecd\" class=\"body-graf\">\u201cI think there are a lot of judges in fear that they\u2019re going to make a decision based on something that\u2019s not real, something AI-generated, and it\u2019s going to have real impacts on someone\u2019s life,\u201d he said.<\/p>\n<p id=\"anchor-c3a705\" class=\"body-graf\">Many judges across the country agree, even those who advocate for the use of AI in court. Judge Scott Schlegel serves on the Fifth Circuit Court of Appeal in Louisiana and is a <a href=\"https:\/\/judgeschlegel.substack.com\/p\/ai-in-chambers-a-framework-for-judicial\" target=\"_blank\" rel=\"nofollow noopener\">leading advocate<\/a> for judicial adoption of AI technology, but he also worries about the risks generative AI poses to the pursuit of truth. <\/p>\n<p id=\"anchor-2e31df\" class=\"body-graf\">\u201cMy wife and I have been together for over 30 years, and she has my voice everywhere,\u201d Schlegel said. \u201cShe could easily clone my voice on free or inexpensive software to create a threatening message that sounds like it\u2019s from me and walk into any courthouse around the country with that recording.\u201d<\/p>\n<p id=\"anchor-c1cba1\" class=\"body-graf\">\u201cThe judge will sign that restraining order. They will sign every single time,\u201d said Schlegel, referring to the hypothetical recording. \u201cSo you lose your cat, dog, guns, house, you lose everything.\u201d<\/p>\n<p id=\"anchor-7aee8a\" class=\"body-graf\">Judge Erica Yew, a member of California\u2019s Santa Clara County Superior Court since 2001, is passionate about AI\u2019s use in the court system and its potential to increase access to justice. Yet she also acknowledged that forged audio could easily lead to a protective order and advocated for more centralized tracking of such incidents. \u201cI am not aware of any repository where courts can report or memorialize their encounters with deep-faked evidence,\u201d Yew told NBC News. 
\u201cI think AI-generated fake or modified evidence is happening much more frequently than is reported publicly.\u201d<\/p>\n<p id=\"anchor-487539\" class=\"body-graf\">Yew said she is concerned that deepfakes could corrupt other, long-trusted methods of obtaining evidence in court. With AI, \u201csomeone could easily generate a false record of title and go to the county clerk\u2019s office,\u201d for example, to establish ownership of a car. But the county clerk likely will not have the expertise or time to check the ownership document for authenticity, Yew said, and will instead just enter the document into the official record.<\/p>\n<p id=\"anchor-e0d0c2\" class=\"body-graf\">\u201cNow a litigant can go get a copy of the document and bring it to court, and a judge will likely admit it. So now do I, as a judge, have to question a source of evidence that has traditionally been reliable?\u201d Yew wondered. <\/p>\n<p id=\"anchor-913ef1\" class=\"body-graf\">Though fraudulent evidence has long been an issue for the courts, Yew said AI could cause an unprecedented expansion of realistic, falsified evidence. \u201cWe\u2019re in a whole new frontier,\u201d Yew said.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/www.newsbeep.com\/us\/wp-content\/uploads\/2025\/11\/251114-Judge-Erica-Yew-gk-85364e.jpg\" alt=\"Judge Erica Yew.\" height=\"2500\" width=\"2000\"\/>Santa Clara County, Calif., Superior Court Judge Erica Yew. Courtesy of Erica Yew<\/p>\n<p id=\"anchor-34c320\" class=\"body-graf\">Schlegel and Yew are among a small group of judges leading efforts to address the emerging threat of deepfakes in court. 
They are joined by <a href=\"https:\/\/www.ncsc.org\/our-centers-projects\/trincsc-ai-policy-consortium-law-courts\" target=\"_blank\" rel=\"nofollow noopener\">a consortium<\/a> of the <a href=\"https:\/\/www.ncsc.org\/about-us\" target=\"_blank\" rel=\"nofollow noopener\">National Center for State Courts<\/a> and the <a href=\"https:\/\/www.thomsonreuters.com\/en\/institute\" target=\"_blank\" rel=\"nofollow noopener\">Thomson Reuters Institute<\/a>, which has created resources for judges to address the growing deepfake quandary. <\/p>\n<p id=\"anchor-2845ea\" class=\"body-graf\">The <a href=\"https:\/\/www.ncsc.org\/resources-courts\/artificial-intelligence-ai\" target=\"_blank\" rel=\"nofollow noopener\">consortium<\/a> labels deepfakes as \u201cunacknowledged AI evidence\u201d to distinguish these creations from \u201cacknowledged AI evidence\u201d like AI-generated accident reconstruction videos, which are recognized by all parties as AI-generated.<\/p>\n<p id=\"anchor-981585\" class=\"body-graf\">Earlier this year, the consortium published a <a href=\"https:\/\/nationalcenterforstatecourts.app.box.com\/s\/vnsf0jeky3k3re4093leqcfvudf7c0k6\" target=\"_blank\" rel=\"nofollow noopener\">cheat sheet<\/a> to help judges deal with deepfakes. The document advises judges to ask those providing potentially AI-generated evidence to explain its origin, reveal who had access to it and disclose whether it has been altered in any way, and to look for corroborating evidence. <\/p>\n<p id=\"anchor-dc8367\" class=\"body-graf\">In April 2024, <a href=\"https:\/\/www.nbcnews.com\/news\/us-news\/washington-state-judge-blocks-use-ai-enhanced-video-evidence-rcna141932\" target=\"_blank\" rel=\"nofollow noopener\">a Washington state judge denied<\/a> a defendant\u2019s efforts to use an AI tool to clarify a video that had been submitted. 
<\/p>\n<p id=\"anchor-94c634\" class=\"body-graf\">Beyond this cadre of advocates, judges around the country are starting to take note of AI\u2019s impact on their work, according to Hiljus, the Minnesota judge.<\/p>\n<p id=\"anchor-530143\" class=\"body-graf\">\u201cJudges are starting to consider, is this evidence authentic? Has it been modified? Is it just plain old fake? We\u2019ve learned over the last several months, especially with OpenAI\u2019s Sora coming out, that it\u2019s not very difficult to make a really realistic video of someone doing something they never did,\u201d Hiljus said. \u201cI hear from judges who are really concerned about it and who think that they might be seeing AI-generated evidence but don\u2019t know quite how to approach the issue.\u201d <\/p>\n<p id=\"anchor-0936f7\" class=\"body-graf\">Hiljus is currently surveying state judges in Minnesota to better understand how generative AI is showing up in their courtrooms. <\/p>\n<p id=\"anchor-c2a5b6\" class=\"body-graf\">To address the rise of deepfakes, several judges and legal experts are advocating for changes to judicial rules and guidelines on how attorneys verify their evidence. By law and in concert with the Supreme Court, the U.S. Congress establishes the <a href=\"https:\/\/www.uscourts.gov\/file\/78325\/download\" target=\"_blank\" rel=\"nofollow noopener\">rules<\/a> for how evidence is used in lower courts.<\/p>\n<p id=\"anchor-0c9f78\" class=\"body-graf\">One proposal crafted by Maura R. Grossman, a research professor of computer science at the University of Waterloo and a practicing lawyer, and Paul Grimm, a professor at Duke Law School and former federal district judge, would require parties alleging that the opposition used deepfakes to thoroughly substantiate their arguments. 
<a href=\"https:\/\/www.uscourts.gov\/sites\/default\/files\/2025-04\/25-ev-a_suggestion_from_prof._rebecca_delfino_-_rule_901.pdf\" target=\"_blank\" rel=\"nofollow noopener\">Another proposal<\/a> would transfer the duty of deepfake identification from impressionable juries to judges. <\/p>\n<p id=\"anchor-8b1c82\" class=\"body-graf\">The proposals were considered by the U.S. Judicial Conference\u2019s Advisory Committee on Evidence Rules <a href=\"https:\/\/www.uscourts.gov\/forms-rules\/records-rules-committees\/agenda-books\/advisory-committee-evidence-rules-may-2025\" target=\"_blank\" rel=\"nofollow noopener\">when it conferred in May<\/a>, but they were not approved. Members argued \u201cexisting standards of authenticity are up to the task of regulating AI evidence.\u201d The U.S. Judicial Conference is a voting body of 26 federal judges, overseen by the chief justice of the Supreme Court. After a committee recommends a change to judicial rules, the conference votes on the proposal, which is then reviewed by the Supreme Court and voted upon by Congress.<\/p>\n<p id=\"anchor-fe3655\" class=\"body-graf\">Despite opting not to move the rule change forward for now, the committee was eager to keep a deepfake evidence rule \u201cin the bullpen in case the Committee decides to move forward with an AI amendment in the future,\u201d according to <a href=\"https:\/\/www.uscourts.gov\/sites\/default\/files\/document\/2025-05_evidence_rules_committee_agenda_book_final.pdf\" target=\"_blank\" rel=\"nofollow noopener\">committee notes<\/a>. <\/p>\n<p id=\"anchor-fe18e4\" class=\"body-graf\">Grimm was pessimistic about this decision given how quickly the AI ecosystem is evolving. 
By his accounting, it takes a minimum of three years for a new federal rule on evidence to be adopted.<\/p>\n<p id=\"anchor-a6c21d\" class=\"body-graf\">The Trump administration\u2019s <a href=\"https:\/\/www.whitehouse.gov\/wp-content\/uploads\/2025\/07\/Americas-AI-Action-Plan.pdf\" target=\"_blank\" rel=\"nofollow noopener\">AI Action Plan<\/a>, released in July as the administration\u2019s road map for American AI efforts, highlights the need to \u201ccombat synthetic media in the court system\u201d and advocates for exploring deepfake-specific standards similar to the proposed evidence rule changes. <\/p>\n<p id=\"anchor-a07e7e\" class=\"body-graf\">Yet other law practitioners think a cautious approach is wisest, waiting to see how often deepfakes are really passed off as evidence in court and how judges react before moving to update overarching rules of evidence. <\/p>\n<p id=\"anchor-3c6810\" class=\"body-graf\">Jonathan Mayer, the former chief science and technology adviser and chief AI officer at the U.S. Justice Department under President Joe Biden and now a professor at Princeton University, told NBC News he routinely encountered the issue of AI in the court system: \u201cA recurring question was whether effectively addressing AI abuses would require new law, including new statutory authorities or court rules.\u201d<\/p>\n<p id=\"anchor-6ac3ec\" class=\"body-graf\">\u201cWe generally concluded that existing law was sufficient,\u201d he said. However, \u201cthe impact of AI could change \u2014 and it could change quickly \u2014 so we also thought through and prepared for possible scenarios.\u201d<\/p>\n<p id=\"anchor-4b4378\" class=\"body-graf\">In the meantime, attorneys may become the first line of defense against deepfakes invading U.S. courtrooms. 
<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/www.newsbeep.com\/us\/wp-content\/uploads\/2025\/11\/251114-Judge-Scott-Schlegel-gk-db2845.jpg\" alt=\"Judge Scott Schlegel.\" height=\"2500\" width=\"2000\"\/>Louisiana Fifth Circuit Court of Appeal Judge Scott Schlegel.Courtesy of Scott Schlegel<\/p>\n<p id=\"anchor-f0e044\" class=\"body-graf\">Judge Schlegel pointed to Louisiana\u2019s <a href=\"https:\/\/www.legis.la.gov\/legis\/ViewDocument.aspx?d=1425558\" target=\"_blank\" rel=\"nofollow noopener\">Act 250<\/a>, passed earlier this year, as a successful and effective way to change norms about deepfakes at the state level. The act mandates that attorneys exercise \u201creasonable diligence\u201d to determine if evidence they or their clients submit has been generated by AI. <\/p>\n<p id=\"anchor-b08b86\" class=\"body-graf\">\u201cThe courts can\u2019t do it all by themselves,\u201d Schlegel said. \u201cWhen your client walks in the door and hands you 10 photographs, you should ask them questions. Where did you get these photographs? Did you take them on your phone or a camera?\u201d<\/p>\n<p id=\"anchor-d7d122\" class=\"body-graf\">\u201cIf it doesn\u2019t smell right, you need to do a deeper dive before you offer that evidence into court. And if you don\u2019t, then you\u2019re violating your duties as an officer of the court,\u201d he said.<\/p>\n<p id=\"anchor-c83be9\" class=\"body-graf\">Daniel Garrie, co-founder of cybersecurity and digital forensics company Law &amp; Forensics, said that human expertise will have to continue to <a href=\"https:\/\/www.law.com\/legaltechnews\/2025\/01\/16\/facing-the-deepfake-crisis-insights-from-enterprise-and-legal-frontlines\/?slreturn=20251112000618\" target=\"_blank\" rel=\"nofollow noopener\">supplement digital-only efforts<\/a>. <\/p>\n<p id=\"anchor-263c87\" class=\"body-graf\">\u201cNo tool is perfect, and frequently additional facts become relevant,\u201d Garrie wrote via email. 
\u201cFor example, it may be impossible for a person to have been at a certain location if GPS data shows them elsewhere at the time a photo was purportedly taken.\u201d<\/p>\n<p id=\"anchor-021c7b\" class=\"body-graf\">Metadata \u2014 the invisible descriptive data attached to a file, recording facts like its origin, creation date and modification date \u2014 could be a key defense against deepfakes in the near future. <\/p>\n<p id=\"anchor-72bbc7\" class=\"body-graf\">For example, in the Mendones case, the court found that the metadata of one of the deepfaked videos presented as real showed it was captured on an iPhone 6, which was impossible given that the plaintiffs\u2019 argument required capabilities available only on an iPhone 15 or newer. <\/p>\n<p id=\"anchor-362553\" class=\"body-graf\">Courts could also mandate that video- and audio-recording hardware include robust cryptographic signatures attesting to the provenance and authenticity of their outputs, allowing courts to verify that content was recorded by actual cameras. <\/p>\n<p id=\"anchor-8bd6c1\" class=\"body-graf\">Such technological solutions may still run into critical stumbling blocks similar to those that plagued prior legal efforts to adapt to new technologies, like <a href=\"https:\/\/nij.ojp.gov\/nij-hosted-online-training-courses\/what-every-law-enforcement-officer-should-know-about-dna\/sources-locations-and-limitations\/limitations-dna-evidence\" target=\"_blank\" rel=\"nofollow noopener\">DNA testing<\/a> or even <a href=\"https:\/\/massbar.org\/publications\/podcast\/lawyers-journal-2006-march\/fingerprint-identification-and-its-aura-of-infallibility\" target=\"_blank\" rel=\"nofollow noopener\">fingerprint analysis<\/a>. 
Parties lacking the latest technical AI and deepfake know-how may face a disadvantage in proving evidence\u2019s origin.<\/p>\n<p id=\"anchor-c8f2ea\" class=\"body-graf\">Grossman, the University of Waterloo professor, said that for now, judges need to keep their guard up.<\/p>\n<p id=\"anchor-cd4304\" class=\"body-graf\">\u201cAnybody with a device and internet connection can take 10 or 15 seconds of your voice and have a convincing enough tape to call your bank and withdraw money. Generative AI has democratized fraud.\u201d<\/p>\n<p id=\"anchor-d21e3e\" class=\"endmark body-graf\">\u201cWe\u2019re really moving into a new paradigm,\u201d Grossman said. \u201cInstead of trust but verify, we should be saying: Don\u2019t trust and verify.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"Judge Victoria Kolakowski sensed something was wrong with Exhibit 6C. Submitted by the plaintiffs in a California housing&hellip;\n","protected":false},"author":2,"featured_media":299066,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[45],"tags":[182,181,507,74],"class_list":{"0":"post-299065","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/299065","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/comments?post=299065"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/pos
ts\/299065\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media\/299066"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media?parent=299065"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/categories?post=299065"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/tags?post=299065"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}