
Folk artist’s performances of public domain songs caught in AI copyright conflict

Folk singer Campbell Murphy in AI dispute

A dispute over centuries-old folk songs is exposing a distinctly modern vulnerability in the music business. A musician’s recent experience highlights how AI-generated content and automated copyright enforcement can intersect in ways that leave artists scrambling to regain control.


The musician in question, Campbell Murphy, was informed by YouTube that the revenue from her performances of “Darling Corey” and “In the Pines” would be shared with a rights holder.

Those songs have famously been covered by artists from Lead Belly to Nirvana, yet they sit firmly in the public domain.

According to The Verge, the claim stemmed from videos uploaded via distributor Vydia by another user. Even though those uploads were not publicly visible, they were enough to activate YouTube’s Content ID system.

That system works by matching uploaded audio against a database of registered recordings. When it finds a match, it can assign monetization rights automatically, without assessing whether the claimant actually owns the underlying work.

Vydia later withdrew the claims and banned the uploader. A spokesperson told The Verge that invalid claims represent a very small share of activity and emphasized that their accuracy rates are considered strong by industry standards.


Still, the episode suggests the system can be misapplied with relatively little friction.

When AI rewrites an artist’s presence

At the same time, Campbell discovered songs appearing under her name on streaming platforms that she had never uploaded.

These tracks appeared to reuse her YouTube performances but with altered vocals.

AI detection tools indicated the recordings were probably generated or modified using synthetic audio, though such tools are not definitive and can produce uncertain results. That ambiguity makes disputes harder to resolve.

Campbell admitted she hadn’t expected this level of exposure: “I was kind of under the impression that we had a little bit more checks in place before someone could just do that.”


Getting the tracks removed took persistence. “I became a pest,” she said. Even after takedowns, duplicate artist profiles continued to surface, fragmenting her presence across platforms.

Spotify is testing a feature that would allow artists to approve releases tied to their profiles before they go live. Whether that will meaningfully reduce abuse remains to be seen.

A fragile system under pressure

Put together, these incidents reveal a system that struggles under new technological pressure.

AI tools can mimic voices from publicly available recordings. Distribution networks allow rapid uploads. And automated copyright systems can validate claims without fully assessing ownership.

Similar concerns have been raised elsewhere in the industry, where AI-generated impersonations and questionable claims are becoming more common. Campbell’s assessment reflects that broader unease: “I think it goes way deeper than we think it does.”


The implications are increasingly concrete. Platforms may need stricter verification layers, artists may require stronger identity safeguards, and regulators could face growing pressure to clarify how copyright applies in an AI-driven landscape.

Sources: The Verge
