Spotify is causing quite a stir after reports revealed that AI-generated songs appeared on the official pages of musicians who died years ago, uploaded without permission from their estates or labels. This raises some serious questions about respect, rights, and what’s next for music streaming.
One glaring example is Blaze Foley, a Texas country singer-songwriter who died in 1989. An AI track called “Together” popped up on his official Spotify page on July 14. The song featured AI-made cover art and sounded like a current country ballad, but it didn’t match Foley’s original style at all.
Craig McDonald, who owns Lost Art Records (the label behind Foley’s music and Spotify page), told 404 Media that the track was “nowhere near Blaze’s style” and called it “an AI schlock bot.” He said the song “has the authenticity of an algorithm” and confirmed his label had nothing to do with releasing it.
McDonald flagged the song over the weekend and reached out to the label’s distributor, Secretly Distribution, but hasn’t heard back yet. He’s urging Spotify to put safeguards in place that stop content from appearing on an artist’s official page without permission.
“It’s harmful to Blaze’s standing that this happened,” McDonald said. “It’s kind of surprising that Spotify doesn’t have a security fix for this type of action, and I think the responsibility is all on Spotify. They could fix this problem.”
AI Tracks Linked to TikTok’s SoundOn and Unknown Distributor
Spotify later admitted the song broke its Deceptive Content policy and removed it after it was flagged. The track was distributed through SoundOn, a TikTok-owned platform that allows users to upload music directly to TikTok and other services. TikTok stated that both the song and the uploader’s account were removed.
The Blaze Foley track wasn’t an isolated case. Another song, “Happened To You,” appeared last week on the Spotify page of Grammy-winning country artist Guy Clark, who died in 2016. A third track, “with you” by Dan Berk, carried the same copyright tag, “Syntax Error.” Reality Defender, a company specializing in deepfake detection, said all three tracks showed signs of being AI-generated.
No verified company named Syntax Error seems to be behind these uploads, and Dan Berk hasn’t commented yet.
AI music on Spotify isn’t totally new; acts like The Velvet Sundown have openly embraced it. But this new wave is different because it ties unauthorized AI songs directly to the legacies of real artists who are no longer here. How do you feel about AI tracks being linked to musicians who can’t approve or deny their use? It’s unsettling, to say the least.
What’s Spotify’s next move in handling this mess? Will it tighten controls or keep letting AI content slip through? I’m curious to see how this affects trust in streaming platforms overall.