AI Trained on Your Books? What the Bartz v. Anthropic Lawsuit Means for Authors

What happens when AI eats your book without asking?

That’s the question at the heart of Bartz v. Anthropic, a landmark lawsuit that could decide whether authors get paid when their books are used to train AI.

If you’ve published a book (self-published or traditionally published), your work could already be part of this fight. Anthropic, the company behind the Claude AI models, is accused of downloading millions of copyrighted books (many from pirate libraries) to feed its AI.

The case could reshape how AI companies license creative works, how royalties are handled, and how much control authors have over their books in the age of AI.

Billions of dollars in damages are on the line (and so is the precedent that could guide every future lawsuit like it).

But the legal landscape is messy…

Between overlapping lawsuits, split rulings, and shifting strategies, it’s easy to get lost in the details.

That’s why we put together this guide. In the sections below, we’ll break down:

  • What this lawsuit is about (and why it matters)
  • The key rulings so far and what’s next
  • How the Authors Guild is getting involved
  • What this could mean for your rights as an author
  • How you can sign up to stay informed (and possibly benefit)

By the end, you’ll know exactly what’s happening, why it matters, and what to do next if your books are affected.

IMPORTANT:

I’m not a lawyer (much to my mother’s dismay), so this article shouldn’t be taken as legal advice. However, I’ve done my best to gather every piece of relevant information authors need to know about Bartz v. Anthropic and summarize it in a clear, easy-to-follow guide.

With that disclaimer out of the way, let’s dive right in.

We’ll start with the big, obvious question…

What is the Bartz v. Anthropic Lawsuit About?

In August 2024, three authors (Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson) filed a class-action lawsuit against Anthropic, the company behind Claude.

Their claim is simple: Anthropic used their books, along with millions of others, to train its AI without permission.

According to the lawsuit, Anthropic allegedly:

  • Downloaded huge datasets of copyrighted books from pirate libraries like Library Genesis and Pirate Library Mirror.
  • Scanned and digitized some legally purchased books to include in its AI training dataset.
  • Built a “central library” of text to train Claude, its family of large language models.

Why does this matter?

Because if you’ve published a book, whether you self-published on Amazon or released traditionally through a major house, your work may have been part of these datasets.

And this isn’t just about Anthropic…

The case could shape how all AI companies handle copyrighted works moving forward. It could also decide whether authors deserve compensation when their books are used to train AI.

The Key Rulings So Far

In June 2025, U.S. District Judge William Alsup issued a partial summary judgment in the Bartz v. Anthropic case.

His ruling was a mixed bag: some of Anthropic’s practices were allowed under copyright law, but others will be decided at trial.

Here’s the simplified breakdown:

Issue | Judge Alsup’s Ruling | What It Means for Authors
Training Claude on legally purchased books | Fair use ✔️ | No damages risk there
Digitizing print books Anthropic bought | Fair use ✔️ | Safe practice
Training Claude on pirated books | Not fair use ❌ | Trial will decide damages

The takeaway: how Anthropic acquired the data matters. Using books the company legally bought or licensed is considered fair use under current law. But using pirated copies opens the door to significant liability.

The trial in December 2025 will focus on these pirated works and could lead to damages in the billions.

What Happens Next

With Judge Alsup’s partial ruling behind us, the case now moves toward a full trial scheduled to begin December 1, 2025.

This trial will focus on one central question:

Did Anthropic illegally use pirated books to train its Claude AI models and, if so, what are the damages?

Several key issues will be decided:

  • How many pirated works were used in training Claude
  • Whether Anthropic’s infringement was willful or unintentional
  • How much Anthropic may owe authors in statutory damages

Because Judge Alsup certified the case as a class action, the outcome will affect not just the three named plaintiffs but potentially thousands of authors whose books were scraped from pirate libraries.

The stakes are high…

Statutory damages for copyright infringement can range from $750 to $150,000 per work.

Depending on how many titles were involved, Anthropic’s potential liability could reach into the billions.
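To see how quickly those numbers compound, here’s a rough back-of-envelope sketch. The 500,000-work count is purely hypothetical (the actual number of infringed titles will be determined at trial); only the per-work dollar range comes from the statute.

```python
# Statutory damages under 17 U.S.C. § 504(c): $750 to $30,000 per work,
# up to $150,000 per work if the infringement is found to be willful.
MIN_PER_WORK = 750
MAX_PER_WORK = 150_000  # willful-infringement ceiling

works = 500_000  # hypothetical count of infringed titles

low_estimate = works * MIN_PER_WORK    # 375,000,000
high_estimate = works * MAX_PER_WORK   # 75,000,000,000

print(f"Low end:  ${low_estimate:,}")
print(f"High end: ${high_estimate:,}")
```

Even at the statutory minimum, half a million works would put damages in the hundreds of millions; at the willful-infringement ceiling, the same count reaches tens of billions, which is why the work count is such a contested question.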

Are There Similar AI Cases Going On Right Now?

Bartz v. Anthropic isn’t the only lawsuit dealing with AI and copyrighted books, but it’s the one moving fastest and headed toward a full trial.

One related case involves the Authors Guild, a nonprofit advocacy group for writers. The Guild has launched a separate class action against Anthropic on behalf of authors whose works were allegedly downloaded from pirate libraries and used to train Claude.

We’ll explain how this case connects to Bartz in the next section, but for now, just know that thousands of authors could be represented under the Guild-backed action.

There are other AI-related lawsuits worth knowing about:

  • Kadrey v. Meta: Several authors, including Richard Kadrey, have sued Meta over the use of their books to train its AI models. This case raises similar copyright issues as Bartz v. Anthropic, but follows a different legal strategy.
  • OpenAI lawsuits: Multiple suits have been filed against OpenAI (the company behind ChatGPT) over its use of copyrighted materials, though these cases are moving more slowly than Bartz.

Even with these other cases in play, Bartz v. Anthropic is the one to watch. It’s the first major AI copyright lawsuit heading to trial, and its outcome will influence every similar case that follows.

Why There Are Two Anthropic Cases

By now, you’ve probably noticed there are two Anthropic-related lawsuits: the one filed by the three named plaintiffs (Bartz v. Anthropic) and the Authors Guild-backed class action.

Here’s how they connect:

  • Bartz v. Anthropic:
    • Filed by three individual authors: Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson
    • Focuses on Anthropic’s alleged use of pirated books to train Claude
  • Authors Guild-backed class action:
    • Filed separately to represent a broader pool of authors
    • Targets the same alleged conduct but with a larger advocacy role, helping coordinate author notifications, opt-ins, and opt-outs

Even though they started as separate cases, the court has consolidated them procedurally. That means there’s one unified trial scheduled for December 1, 2025. Evidence, rulings, and damages will be decided together.

For authors, this simplifies things: no matter which path you fall under, the outcome of this single trial determines how potential damages will be handled.

So, to quickly recap:

  • Two lawsuits, one trial
  • Bartz = named plaintiffs
  • Authors Guild-backed class = thousands of affected authors
  • Unified trial begins December 1, 2025

What This Means for Authors

For authors, this case is about more than Anthropic or Claude. It’s about setting a precedent for how creative works are treated in the age of AI.

Three big things are on the line:

  1. Control over your work:
    • If this trial results in strong protections for authors, AI companies may have to license your books before using them to train their models.
  2. Potential compensation:
    • If you’ve published a book that was scraped from pirate libraries and used without permission, you could be entitled to a portion of any damages awarded.
  3. Future publishing rights:
    • The ruling could shape how publishing contracts handle AI training rights going forward. Future deals may start including royalty structures or opt-out clauses specifically tied to AI datasets.

Even if your books weren’t used, this case sets a framework that could impact your rights, royalties, and creative control for years to come.

How Authors Can Take Part in the Lawsuits

If you’ve published a book, there’s a chance your work was included in the datasets Anthropic used to train its Claude AI models. If you’re not sure whether you’re affected, don’t worry… you don’t need to know right now.

The safest approach is to sign up to stay informed. This ensures you’ll receive notifications about eligibility, opt-in deadlines, and any potential payouts if damages are awarded.

Here are the two main resources:

  1. Bartz v. Anthropic:
  • Use the author contact form provided by Lieff Cabraser Heimann & Bernstein, the lead firm representing the named plaintiffs.
    • Signing up ensures you’ll get updates specific to the Bartz case and your potential inclusion.
  2. Authors Guild-backed class action:
    • Visit the Authors Guild information page to learn more, check eligibility details, and register for updates.
    • You do not have to be an Authors Guild member to receive notifications.

Signing up doesn’t commit you to joining the lawsuit or taking any action later. It simply keeps you in the loop so you can make informed decisions as the case moves forward.

Key Takeaways and Timeline

The Bartz v. Anthropic lawsuit is one of the most important copyright cases of the AI era, especially for authors. Here’s what you need to know at a glance:

Key Takeaways

  • Bartz v. Anthropic is the first major AI copyright lawsuit heading to trial.
  • Judge Alsup ruled that training on pirated books is not fair use.
  • Training on books Anthropic legally bought or licensed is considered fair use.
  • The case has been certified as a class action, meaning thousands of authors could be affected.
  • The unified trial for both the Bartz plaintiffs and the Authors Guild-backed class begins December 1, 2025.
  • Potential damages could reach into the billions of dollars.
  • If you’ve published a book, it’s worth signing up to stay informed.

Timeline of Key Events

Date | Event
Aug 2024 | Bartz v. Anthropic lawsuit filed
Jun 2025 | Judge Alsup issues split ruling on fair use
Jul 2025 | Class action certification granted
Dec 1, 2025 | Unified trial begins

Final Thoughts

AI is changing publishing faster than most of us expected, and Bartz v. Anthropic is one of the first cases deciding how far those changes can go.

If you’ve published a book, your work could already be part of this fight (even if you never gave permission). The safest move is to stay informed so you know where you stand and what your options are.

So: sign up with the resources above.

That way, you won’t be left out of the loop while this landmark case unfolds.


