Streaming service Deezer recently ran an experiment with the research firm Ipsos. The finding — that 97 percent of people can’t tell the difference between fully AI-generated and human-made music — was alarming. But it’s also not the whole story.
In the survey, 9,000 participants listened to three tracks and were asked to guess which, if any, were completely AI-generated. If the participant failed to guess all three correctly, they were put in the fail pile. That means if you got two of three correct, Deezer and Ipsos still said you couldn’t tell the difference between fully AI-generated music and the real deal.
Deezer sent me the three tracks it used in the study, so I decided to run my own (less scientific) experiment. I had 10 people listen to the same tracks and gave them the same prompt. People did have trouble identifying which songs were fully AI. Only one person got all three right. But if I didn’t bundle the responses, the results were much less dire: people correctly identified whether a track was AI- or human-generated 43 percent of the time.
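To see how much the all-or-nothing scoring inflates the failure rate, here is a minimal simulation sketch. It is not Deezer’s methodology — it simply assumes listeners guess each track independently with some fixed per-track accuracy (the 43 percent figure from my informal test is used as an example), then scores them both per track and under the study’s rule, where missing even one of three tracks counts as a total fail.

```python
import random

def simulate(per_track_accuracy: float, listeners: int = 100_000,
             tracks: int = 3, seed: int = 0):
    """Return (per-track accuracy, share of listeners who got all tracks right)."""
    rng = random.Random(seed)
    all_correct = 0      # listeners who correctly classified every track
    total_correct = 0    # correct classifications across all tracks
    for _ in range(listeners):
        correct = sum(rng.random() < per_track_accuracy for _ in range(tracks))
        total_correct += correct
        if correct == tracks:
            all_correct += 1
    return total_correct / (listeners * tracks), all_correct / listeners

per_track, bundled = simulate(0.43)
```

Under independence, a 43 percent per-track accuracy collapses to roughly 0.43³ ≈ 8 percent under the all-or-nothing rule — so about 92 percent of listeners land in the “fail pile” even though they classify nearly half of individual tracks correctly.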
It’s also worth noting that several people told me one of the songs was so terrible, so obviously AI, that they thought it had to be a trap and guessed it was real.
Unsurprisingly, participants in Deezer’s study were a little caught off guard by how poorly they performed. Seventy-one percent were surprised by the results, and 51 percent said it made them uncomfortable to not be able to tell the difference between AI- and human-created art.
Opinions on the impact were split, with 51 percent believing that AI will lead to the creation of “more low-quality, generic sounding” music. Somewhat shockingly, only 40 percent said they would skip AI music without listening if they knowingly came across it.
One area where most agreed, however, was in the need for transparency. Eighty percent want AI-generated music to be clearly labeled. That is already Deezer’s approach: it has built a system that automatically detects and labels 100 percent AI-generated content from the most popular models, like Suno and Udio. Deezer also excludes music labeled as AI from its algorithmic recommendations.
Spotify recently announced steps to combat AI slop on its platform, but stopped short of saying it would explicitly label AI content. It announced policies regarding AI impersonation and a new spam filter that should keep many of the worst actors off its platform. But instead of blanket labeling, it’s working toward a standardized credits system, saying, “The industry needs a nuanced approach to AI transparency, not to be forced to classify every song as either ‘is AI’ or ‘not AI.’” That system, however, would rely almost entirely on labels and artists honestly disclosing when songs use AI, even if it’s simply to help in the mixing process.
Manuel Moussallam, director of research at Deezer, tells The Verge that there is a bit of a gray area around hybrid content that might use AI elements. But he says this is “not a technical problem. It’s a transparency issue and it’s an ethical issue” that will require all parties involved, from the creators to the music distribution services like DistroKid to the streaming platforms, to act responsibly.
What is clear is that the amount of AI-generated music being uploaded is staggering, and only increasing. Deezer says that it receives over 50,000 AI-generated tracks per day, which account for more than 34 percent of music added to the service.