{"id":31441,"date":"2025-07-29T08:50:20","date_gmt":"2025-07-29T08:50:20","guid":{"rendered":"https:\/\/www.newsbeep.com\/ca\/31441\/"},"modified":"2025-07-29T08:50:20","modified_gmt":"2025-07-29T08:50:20","slug":"a-copyright-lawsuit-over-pirated-books-could-result-in-business-ending-damages-for-anthropic","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ca\/31441\/","title":{"rendered":"A copyright lawsuit over pirated books could result in \u2018business-ending\u2019 damages for Anthropic"},"content":{"rendered":"<p>The class-action lawsuit against Anthropic centers on the company\u2019s use of potentially pirated books to train its large language model, Claude, and could leave it on the hook for billions of dollars\u2019 worth of damages.<\/p>\n<p>According to court filings, the company downloaded millions of copyrighted works from shadow libraries like LibGen and PiLiMi to train AI models and build a \u201ccentral library\u201d of digital books that would include \u201call the books in the world\u201d and preserve them indefinitely. The plaintiffs\u2014who include authors Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson\u2014allege that millions of these works were obtained from piracy websites in direct violation of copyright law.<\/p>\n<p>The judge presiding over the case, William Alsup, recently ruled that training AI models on lawfully acquired books qualifies as \u201cfair use\u201d and that AI companies do not need a license from copyright holders to conduct such training, a decision that was viewed as a major win for the AI sector.<\/p>\n<p>The still unresolved issue, however, is how Anthropic obtained and stored the copyrighted books. 
The judge drew a distinction when it came to the use of pirated materials, advising Anthropic that a separate trial \u201con the pirated copies\u201d and \u201cthe resulting damages\u201d would be forthcoming.<\/p>\n<p>\u201cThe problem is that a lot of these AI companies have scraped piracy sites like LibGen \u2026 where books have been uploaded in electronic form, usually PDF, without the permission of the authors, without payment,\u201d Luke McDonagh, an associate professor of law at LSE, told Fortune.<\/p>\n<p>\u201cThe judge seems to be suggesting that if you had bought a million books from <a href=\"https:\/\/fortune.com\/company\/amazon-com\/\" target=\"_blank\" aria-label=\"Go to https:\/\/fortune.com\/company\/amazon-com\/\" class=\"sc-19cc8fd2-0 iHosVH\" rel=\"nofollow noopener\">Amazon<\/a> in digital form, then you could do the training, and that would be legal, but it\u2019s the downloading from the pirate website that is the problem, because there\u2019s two things, there\u2019s that acquiring of the copy, and then the use of the copy,\u201d he added.<\/p>\n<p>Santa Clara law professor Ed Lee <a href=\"https:\/\/chatgptiseatingtheworld.com\/2025\/07\/17\/anthropic-faces-potential-business-ending-liability-in-statutory-damages-after-judge-alsup-certifies-class-action-by-bartz\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\" aria-label=\"Go to https:\/\/chatgptiseatingtheworld.com\/2025\/07\/17\/anthropic-faces-potential-business-ending-liability-in-statutory-damages-after-judge-alsup-certifies-class-action-by-bartz\/\" class=\"sc-19cc8fd2-0 iHosVH\">said<\/a>\u00a0in a blog post that the ruling could leave Anthropic facing \u201cat least the potential for business-ending liability.\u201d<\/p>\n<p>The plaintiffs are unlikely to prove direct financial harm, such as lost sales, and are likely to instead rely on statutory damages, which can range from $750 to $150,000 per work. 
That range depends heavily on whether the infringement is deemed willful. If the court rules that Anthropic knowingly violated copyright law, the resulting fines could be enormous, potentially in the billions, even at the lower end of the scale.<\/p>\n<p>The number of works included in the class action and whether the jury finds willful infringement are still open questions, but potential damages could range from hundreds of millions to tens of billions of dollars. Even at the low end, Lee argues, damages in the range of $1 billion to $3 billion are possible if just 100,000 works are included in the class action. That figure rivals the largest copyright damage awards on record and could far exceed Anthropic\u2019s current $4 billion in annual revenue.<\/p>\n<p>Lee estimated that the company could be on the hook for up to $1.05 trillion if a jury decides that it willfully pirated 6 million copyrighted books.<\/p>\n<p>Anthropic did not immediately respond to a request for comment from Fortune. However, the company has previously said it \u201crespectfully disagrees\u201d with the court\u2019s decision and is exploring its options, which might include appealing Alsup\u2019s ruling or offering to settle the case. The trial, the first for a certified class action against an AI company over the use of copyrighted materials, is scheduled for Dec. 1.<\/p>\n<p>The verdict could determine the outcomes of similar cases, such as a high-profile ongoing battle between OpenAI and dozens of authors and publishers. 
While the courts do appear to be leaning toward allowing fair use arguments from AI companies, there\u2019s a legal divergence regarding the acquisition of copyrighted works from shadow sites.<\/p>\n<p>In a recent copyright case against <a href=\"https:\/\/fortune.com\/company\/facebook\/\" target=\"_blank\" aria-label=\"Go to https:\/\/fortune.com\/company\/facebook\/\" class=\"sc-19cc8fd2-0 iHosVH\" rel=\"nofollow noopener\">Meta<\/a>, Judge Vince Chhabria argued that the transformative purpose of the AI use effectively legitimizes the earlier unauthorized downloading. The ruling, according to McDonagh, suggested that the positive, transformative use of the works could \u201ccorrect\u201d the initial problematic acquisition. Judge Alsup, by contrast, viewed the downloading of books from unauthorized shadow libraries as \u201cinherently wrong\u201d: even if the AI training itself might qualify as fair use, the initial acquisition of the works was illegitimate and would require compensation.<\/p>\n<p>The two judges also diverged on whether AI-generated outputs could be deemed to compete with the original copyrighted works in their training data. Judge Chhabria acknowledged that if such competition were proved, it might undercut a fair use defense, but found that, in the Meta case, the plaintiffs had failed to provide sufficient evidence of market harm, whereas Judge Alsup concluded that generative AI outputs do not compete with the original works at all.<\/p>\n<p>The legal question around AI companies and copyrighted work has also become increasingly political, with the current administration pushing to allow AI companies to use copyrighted materials for training under broad fair use protections, in an effort to maintain U.S. leadership in artificial intelligence. 
McDonagh said the case against Anthropic was unlikely to leave the company bankrupt, as the Trump administration would be reluctant to allow a ruling that would essentially destroy an AI company.<\/p>\n<p>Judges are also generally averse to issuing rulings that could lead to bankruptcy unless there is a strong legal basis and the action is deemed necessary. Courts have been known to consider the potential impact on the company and its stakeholders when issuing rulings that could result in liquidation.<\/p>\n<p>\u201cThe U.S. Supreme Court, at the moment, seems quite friendly to the Trump agenda, so it\u2019s quite likely that in the end, this wouldn\u2019t have been the kind of doomsday scenario of the copyright ruling bankrupting Anthropic,\u201d McDonagh said. \u201cAnthropic is now valued, depending on different estimates, between $60 and $100 billion. So paying a couple of billion to the authors would by no means bankrupt the organization.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"The class-action lawsuit against Anthropic centers on the company\u2019s use of potentially pirated books to train its 
large&hellip;\n","protected":false},"author":2,"featured_media":31442,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[30],"tags":[353,49,48,75,5433],"class_list":{"0":"post-31441","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-books","8":"tag-books","9":"tag-ca","10":"tag-canada","11":"tag-entertainment","12":"tag-law"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/31441","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/comments?post=31441"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/31441\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media\/31442"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media?parent=31441"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/categories?post=31441"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/tags?post=31441"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}