
If Song Lyrics Are Everywhere Online, Why Is AI Still in Trouble for Using Them?

BEREA, Ky. — This is one of those AI fights where a lot of regular people are going to look at the lawsuit and say, “Come on now.”

If you want the lyrics to almost any hit song in history, you can usually find them online in seconds. So when massive music publishers sue AI companies like Anthropic, claiming their chatbot Claude is infringing copyright by reproducing lyrics, the instinctive reaction from the public is easy to understand: How can something be legally off-limits if it is already plastered all over the internet?

The answer, at least legally, is that “easy to find” does not mean “free to reuse.” Copyright law protects musical compositions, including lyrics, and gives copyright owners exclusive rights over those works. A song can appear on a thousand different websites and still be copyrighted the entire time. The internet did not repeal ownership just because it made copying effortless.


The Case Against Anthropic

That tension is the heart of the case against Anthropic. Music publishers, including Universal Music Group, Concord, and ABKCO, sued the company in October 2023, alleging that Claude reproduces lyrics from at least 500 songs without permission. The publishers argue that Claude’s lyric outputs are not “fair use” and that the system serves protected lyrics back to users on demand, competing with and diluting the market for licensed uses of those works.

But this is where the story gets more interesting than the internet hot-take version.

“Knowing lyrics” and “reusing lyrics” are not the same thing. A model that has learned enough from training data to discuss a song’s cultural impact, identify its themes, or talk about the artist is one thing. A model that can hand a user the verbatim lyrics themselves, close enough to serve as a full substitute for the licensed original, is something else entirely. That distinction is starting to matter more and more in court.


Finding the Middle Ground

Despite the all-or-nothing framing usually found online, there is middle ground here. This is not really a binary choice between “copyright law is dead” and “every AI company is cooked.”

That middle ground is already visible in how the case is playing out. In an earlier stage of the lawsuit, a federal judge denied the publishers’ request for a preliminary injunction. This was not because Anthropic had won the whole argument, but because the publishers had not shown the kind of immediate, irreparable economic harm needed for that emergency remedy. Anthropic also agreed to maintain strict internal guardrails aimed at preventing Claude from generating copyrighted lyrics on demand.

That suggests the legal system is not moving toward a simple yes-or-no rule. Courts appear to be teasing apart several completely different questions:

  • Is training an AI on copyrighted material itself fair use?
  • Is storing that copyrighted material without permission unlawful?
  • Is serving protected text back to users a separate, actionable form of infringement?

Those are fundamentally different acts, and judges do not seem eager to collapse them into one sweeping, industry-killing answer. We saw similar nuance when a judge recently found that Anthropic’s use of lawfully obtained books to train its models was fair use, while holding that downloading pirated copies of those books was not.


Availability vs. Permission

From a common-sense perspective, I think this is why the case has such a strange feel to it for the average user.

On one hand, music publishers are not crazy for defending lyrics. The lyrics are the work. If an AI tool can spit them back out verbatim on command, that starts looking less like machine learning and much more like unlicensed redistribution.

On the other hand, ordinary users are also not crazy for feeling that copyright law looks a little detached from reality when the exact same lyrics can be found instantly via a standard Google search.

The law is built entirely around permission, while the internet trained everyone to think purely in terms of availability. Those are not the same thing, and AI has now shoved that conflict right into the center of public life.


The Bottom Line

What makes this case worth watching is that it points to a much broader question about how creative work gets used in the AI era. If courts decide AI companies can ingest creative work freely to learn, but cannot reproduce it verbatim, that creates one specific future. If they decide the training process itself requires paid permission, that is a radically different one.

Finding lyrics online and having the legal right to reproduce them commercially are still two completely different things. That may frustrate people, and maybe it should. But it is the system we have.

The real question in the Anthropic case is not whether Claude has ever “seen” the lyrics to Hotel California. It is whether an AI company gets to turn protected creative work into an on-demand answer machine without paying the creators for the privilege. That is a much harder question, and a much more important one, than the internet jokes make it out to be.


About the Author

Chad Hembree is a certified network engineer with 30 years of experience in IT and networking. He hosted the nationally syndicated radio show Tech Talk with Chad Hembree throughout the 1990s and into the early 2000s, and previously served as CEO of DataStar. Today, he is based in Berea as the Executive Director of The Spotlight Playhouse, proof that some careers don’t pivot, they evolve.



BereaOnline.com: Covering Berea, KY News and Events Since 1995