Uh-Oh… Can AI Really Use Our Work for Training? A Big New Court Ruling Says... Maybe.
A Recent Court Decision Supports AI Training on Copyrighted Work—But the Legal Story Isn’t Over
If you’ve been wondering whether companies can legally use your photos to train AI models without your permission, a recent court decision by California federal judge William Alsup just made the conversation a little more complicated.
Let’s try to break it down.
What Happened?
Authors recently took on Anthropic, the company behind the Claude AI models (kind of like ChatGPT’s cousin), accusing them of using copyrighted books to train their AI without permission. The authors argued this was a clear copyright violation. Anthropic argued it was legal.
The judge agreed—sort of.
Training AI on copyrighted books? The court said that’s okay.
The judge ruled that using these books to train the AI was considered “fair use” because the AI wasn’t copying the books for the purpose of selling or distributing them. Instead, the court saw it more like “reading to learn.” It’s not about the content being repeated, but about the AI learning patterns and structures.

Using pirated copies? Not okay.
Anthropic didn’t get a full pass. Part of their training material originated from millions of pirated books circulating on the internet. The court made it clear: using stolen material, even for training, is still wrong. Anthropic will face trial later this year to determine what penalties it may owe for that.
Why Photographers Should Pay Attention
Even though this case focused on books, it’s part of a much bigger fight about whether AI companies can scrape and use all kinds of creative work—including photographs, music, paintings, movies, and more—to build their models.
Here’s what matters for us:
Training is treated differently from copying outputs.
This case is only about training AI models. Teaching an AI to "learn" from your photo might be allowed under fair use, but if the AI spits out something too close to your original work, that could still be a problem. This case didn’t settle that part.

Where the training data comes from matters.
If a company pulls your photo from legitimate sources, the court might see that differently than if it came from pirated sites or unauthorized archives. Legal sourcing still counts.

Photography isn’t off the hook.
This case was about text. Visual art, including photography, could eventually be treated differently. Judges may weigh whether an AI-generated image appears too similar to someone’s original photo or harms the photographer’s market. The fight isn’t over.
Other Important Lawsuits Still in Play
This isn’t the final decision on all this. Several other significant cases are still in motion, including:
NY Times v. OpenAI and Microsoft
The Times says its articles were used without permission to train AI models. The case is moving forward, and it could set a new precedent.

Kadrey v. Meta
A group of authors is suing over the use of their books in training AI models. It’s early, but the outcomes here could shift the conversation.

Thomson Reuters v. Ross Intelligence
This earlier case said it was not fair use when AI-generated outputs copied too much from original legal summaries. So the courts aren’t always siding with AI companies.
Bottom line: This ruling is almost certain to be appealed. Nothing is settled for good.
What This Means for Photographers (Right Now)
This ruling leans in favor of AI companies being able to train on existing work, but it doesn’t give them free rein to do whatever they want.
For photographers like us:
It’s a reminder to stay informed and keep protecting our work.
The biggest fights will probably be about outputs, especially when AI-generated images are so close to someone’s original photo that you can’t tell the difference. It’s happened in the past and will likely happen again.
This is just one chapter in a much longer story. Appeals are coming. Other cases are working their way through the courts, including some that are laser-focused on visual art.
While this decision is a step in one direction, it’s not the final word.
Where That Leaves Us
If you’re feeling frustrated or uncertain, you’re not alone. These are complex questions that courts, photographers, other creatives, and AI companies are still grappling with. For now, it appears that training on legally sourced photos may stand, but how that training is used, and what happens when the AI’s output resembles yours, are battles still ahead.
We’ll continue to monitor these cases and share updates that are relevant to our photography community.