Anthropic Settles Landmark AI Copyright Lawsuit with Authors

Artificial intelligence company Anthropic has reached a settlement with a group of authors in a copyright infringement lawsuit that could have exposed the company to billions in damages.

The settlement, announced Tuesday in federal court filings, resolves a class-action lawsuit brought by authors Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson, who accused the San Francisco-based company of illegally downloading millions of copyrighted books to train its Claude chatbot.

Read: Authors Sue Anthropic for Alleged ‘Large-Scale Theft’ of Copyrighted Books

“This historic settlement will benefit all class members,” said Justin Nelson, an attorney for the authors. Financial terms were not disclosed, though legal experts estimate the settlement could reach hundreds of millions of dollars.

The case centered on Anthropic’s use of approximately 7 million books downloaded from pirate websites, including Library Genesis and Pirate Library Mirror. While US District Judge William Alsup ruled in June that training AI models on legally obtained copyrighted works constitutes “fair use,” he found that Anthropic’s acquisition of pirated books violated copyright law.

One expert estimated potential damages could have topped $900 billion if a jury found willful infringement. The company, backed by Amazon (Nasdaq: AMZN) and Alphabet (Nasdaq: GOOG), expects to generate no more than $5 billion in revenue this year while operating at a loss.

Anthropic still faces ongoing copyright challenges. Universal Music Group and other publishers are pursuing a separate lawsuit over the alleged unauthorized use of song lyrics, while Reddit has sued the company for allegedly scraping content from its platform without permission.

Read: Music Publishers File Preliminary Injunction Against Anthropic, Seeking to Halt ‘Improper Use of Copyrighted Works’

The case was scheduled to go to trial in December and represented the first certified class action against an AI company over copyrighted materials. The resolution could influence how similar disputes are resolved and may push the industry toward licensing deals with content creators.

Anthropic declined to comment on the settlement terms, which are expected to be finalized by September 5.

Information for this story was found via the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
