YouTube Says Its Music AI Incubator Will 'Protect' Artists
YouTube is embracing the future of artificial intelligence in the music industry by creating a YouTube Music AI Incubator to “protect” artists and their work, the company said in a press release on Monday. The streaming platform partnered with Universal Music Group (UMG) to launch the Music AI Incubator, working alongside artists and songwriters including Rosanne Cash, Ryan Tedder of OneRepublic, and the estate of musical legend Frank Sinatra, among others.
“The incubator will help inform YouTube’s approach as we work with some of music’s most innovative artists, songwriters, and producers across the industry,” YouTube CEO Neal Mohan said in the press release. A continuing concern is how artists’ work will be protected; YouTube says it relies on Content ID, its rights-management system, along with policy, detection, and enforcement systems to ensure artists can choose how their work is used and have the option to get paid when others appropriate it.
UMG Chairman and CEO, Sir Lucian Grainge, said in a separate press release: “Central to our collective vision is taking steps to build a safe, responsible, and profitable ecosystem of music and video—one where artists and songwriters have the ability to maintain their creative integrity, their power to choose, and to be compensated fairly.”
Several artists appear to be accepting AI, rather than fighting against it, seeking instead to influence how their content is used and ensuring they’re paid for their likeness. “This is about having the option to design how their music is actually used,” Grainge told The Wall Street Journal. “Artists have never had that before, to this extent, leaning into a new technology.”
YouTube’s Music AI Incubator comes as artists have complained about technology being used to create deepfake impersonations, such as the AI-cloned duet of Drake and The Weeknd that went viral earlier this year. That song, “Heart on My Sleeve,” was removed from streaming platforms following a complaint from UMG, which represents both artists. In response, UMG filed a copyright infringement complaint against YouTube and other streaming platforms, arguing that the platforms have a “legal and ethical responsibility to prevent the use of their services in ways that harm artists,” the WSJ reported.
UMG has previously argued that AI needs to be regulated and has pushed for copyright to be applied to artists’ voices that are replicated to make deepfake music. While fighting to get AI-generated songs taken down from streaming platforms, UMG said in April that it has been moving forward with its own AI technology innovations. “However, the training of generative AI using our artists’ music … begs the question as to which side of history all stakeholders in the music ecosystem want to be on,” UMG told CNN.
But Grainge says he believes there shouldn’t be concerns about whether AI will overtake the music industry, ultimately replacing human artists. “AI will never replace human creativity because it will always lack the essential spark that drives the most talented artists to do their best work, which is intention,” he said in the press release. Grainge added, “From Mozart to The Beatles to Taylor Swift, genius is never random.”
But the question of whether geniuses will get paid for their work in the future remains open. YouTube’s Content ID system can be heavy-handed with its automated takedowns—something that pleases rights holders. It can also be gamed by copyright trolls and has repeatedly shown a lack of nuance: at one point, one man’s upload of a white-noise video was hit with five separate copyright claims. Sorting out what is or is not AI-generated music, and who should get paid for it, will likely be an even more daunting task for years to come.