Spotify is completely unserious about combating AI music
Their new policies on AI music do little to protect musicians.
Yesterday, Spotify announced that it was “Strengthen[ing] AI Protections for Artists, Songwriters, and Producers”. This is spin. In reality, it is far behind other platforms on combating AI music, and its announcement is notable primarily for what it leaves out - that is, what Spotify continues to allow.
Let’s look at what it says.
It is employing three tactics to protect musicians against “the worst parts of Gen AI”. These are:
Stronger impersonation rules
Music spam filter
AI disclosures for music with industry-standard credits
These are designed to sound good - but in reality they represent the absolute minimum Spotify could be doing, and they seem specifically designed to continue to allow the type of AI music that musicians actually fear.
Let’s look at each in turn.
1. Stronger impersonation rules
Basically, Spotify will let you issue takedown requests if your voice is cloned, and will try to stop people uploading AI music to your artist page.
To be clear, this should be absolute table-stakes. Voice cloning without permission will almost certainly be illegal anyway when the law catches up with the technology, if it isn’t already. What Spotify is describing here is fraud - and their bare minimum duty should be to identify and stop fraud.
So yes, it’s good that they’re doing this. But they deserve no particular applause for doing so - it would be shocking if any music streaming platform didn’t do this.
2. Music spam filter
They will introduce a new spam filter that identifies “tactics such as mass uploads, duplicates, SEO hacks, artificially short track abuse, and other forms of slop”.
But they already filter spam. They have been doing so for years. Again, this is table-stakes for music streaming platforms. Critically, they don’t define fully AI-generated music as spam.
This ‘new music spam filter’ is included in a post about AI, but it seems to have very little to do with AI. Fully AI-generated music will still be allowed through, as long as you’re not doing things like spamming Spotify with thousands of tracks.
3. AI disclosures for music with industry-standard credits
They say:
We’re helping develop and will support the new industry standard for AI disclosures in music credits … This standard gives artists and rights holders a way to clearly indicate where and how AI played a role in the creation of a track.
There are two key issues with this:
It is voluntary. Uploaders are not required to disclose AI usage. Many, of course, won’t.
It will be virtually invisible. No one reads the track credits - which is where this will show up.
They could do so much better here. They could scan uploaded music for AI output - in fact, it's almost certain they already do, though they don't publish the results - which would let them label AI music whether or not uploaders disclose it. And they could make that label actually visible.
What Spotify still allows
It’s important to read between the lines of the announcement, as doing so reveals what Spotify will still allow:
It allows fully AI-generated music to continue being uploaded, and recommended to users, with no clear label attached.
Remember The Velvet Sundown? Still allowed.
So anyone can use AI music models - some of which are trained on other musicians’ work without permission - to make fully AI-generated music, upload this to Spotify, and take royalties from the pool available to human musicians.
Spotify has a financial incentive to allow this, of course. They make more money from music that isn’t owned by the major labels. So a greater share of AI music on the platform, in general, will increase their profits.
But it’s really sad to see Spotify’s continued slide away from being a platform for human music. I, for one, recently switched to Deezer, whose policies on AI music are much more musician-friendly. I encourage others to do the same.


I’m interested to see how publishers and PROs evolve their policies to deal with copyright attribution on AI music, whether partially or fully AI-composed/produced, and how they draw the distinction. I mention it because I wonder whether streamers’ inclination to allow AI music on their platforms is weighted towards the possibility that it might reduce their publishing royalty burden in future. Most discussions around AI music right now deal with the legitimacy of training, which is a key rights issue, but in future, the question of who claims copyright on machine-authored lyrics or melodies might become an even larger tangled ball of wool.
Instead of paying Spotify for fake music, I give to my local NPR stations! CPR Classical and KUVO.org for jazz, blues, Latin Soul Party, R&B Jukebox! And local DJs, local culture and news, Colorado musicians! Real music!