Deezer, the global music streaming platform based in France, claims to have developed a technique for flagging — and potentially deleting — songs that use artificial intelligence to simulate the performance of popular singers. “We need to take a stand now,” Deezer CEO Jeronimo Folgueira said in an interview. “We are at a pivotal moment in music.” His company plans to “weed out illegal and fraudulent content” in an effort to protect artists. Deezer’s detection technology is still under development. It relies on AI itself, a technology Folgueira said he does not oppose, provided it is used ethically.
“Personally, I would love to see AI bring Whitney Houston back to life, and come up with amazing new tracks with her voice,” Folgueira told the BBC. Although he did not go into detail about how the filter works, the basic gist is that it uses audio analysis powered by machine learning.
The model is trained on a dataset of songs tagged as “authentic” or “fake.” The system can then flag inconsistencies in pitch or timbre, or detect the presence of sound artifacts that are typical of AI-generated audio.
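Deezer has not disclosed how its classifier actually works, but the description above amounts to standard supervised learning on audio-derived features. The following sketch is purely illustrative: the feature names (pitch inconsistency, artifact energy) and the synthetic training data are assumptions, not Deezer's method, and a real system would extract such features from audio with a signal-processing library.

```python
# Hypothetical sketch of an "authentic vs. fake" track classifier.
# Deezer's actual features and model are undisclosed; everything here
# is illustrative. Labels: 0 = authentic recording, 1 = AI-generated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic training set: each row is a track summarized by two
# made-up features, [pitch_inconsistency, artifact_energy].
authentic = rng.normal(loc=[0.2, 0.1], scale=0.05, size=(200, 2))
fake = rng.normal(loc=[0.6, 0.5], scale=0.05, size=(200, 2))
X = np.vstack([authentic, fake])
y = np.array([0] * 200 + [1] * 200)

# Train a simple binary classifier on the tagged examples.
clf = LogisticRegression().fit(X, y)

# Score a new track whose features show high pitch inconsistency
# and strong synthesis artifacts.
suspect = np.array([[0.65, 0.55]])
print(clf.predict(suspect)[0])
```

In practice the interesting work is in the feature extraction (or in learning features end-to-end from spectrograms); the classifier on top can be as simple as the logistic regression shown here.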
Folgueira told the BBC his company also uses its detection system “to root out ‘fake artists’ — pseudonymous musicians with no discernible online presence, who create hundreds of songs (typically overly-simple mood music) to game the royalty payment system.” Such copycats were spotted as early as 2017, and are still being written about.
“We eliminate a huge amount of artists and a huge amount of fake streams every day — and the trend is that this fraudulent behavior just keeps accelerating,” Folgueira said, noting Deezer has been “working with Universal to change the way royalty payments are calculated.”
Universal Music Group is another company deeply interested in tamping down deepfake music. In April, UMG filed takedown orders to have the song “Heart on My Sleeve” — which was generated by AI in the style of two of its biggest artists, Drake and The Weeknd — removed from streaming services. UMG claimed the cloned tune violated copyright, but on the basis of an unauthorized sample used in the song’s intro, not the voice imitation itself. The larger issue of AI cloning an artist’s voice has yet to be legally tested.
But not everyone is opposed. Grimes is offering 50 percent royalties to creators who use her voice in AI-generated songs, according to TechCrunch, which says the Portland-based pop group Yacht “trained an AI on 14 years of their music, then synthesized the results into the album ‘Chain Tripping.’”