YouTube is quickly becoming a home for AI-generated music, and the service is trying to strike a balance between the technology’s fans and the labels eager to protect their copyrights. The company and partners like Universal Music Group (UMG) have unveiled a set of principles for AI music. In theory, the approach encourages adoption while keeping artists paid.
To start, YouTube maintains that “AI is here” and that it must have a “responsible” strategy. Accordingly, it’s forming a Music AI Incubator to help shape that approach. UMG and the artists it represents (including Rosanne Cash, Yo Gotti and Frank Sinatra’s estate) will help gather insights from YouTube’s AI experiments.
YouTube also says AI music must include “appropriate protections” against copyright violations and provide “opportunities” for partners who want to get involved. While the video giant hasn’t detailed what this will entail, it suggests it will build on the Content ID system that lets rights holders flag their material. On top of this, YouTube says it will scale its content policies and safety systems to account for AI. The company already has mechanisms in place to catch copyright abuse, misinformation and other violations, and it intends to pour more resources into them.
The principles are currently vague and don’t do much to change YouTube’s stance. More details are due in the months ahead, however, including policies, specific technologies and monetization options for creators.
Generative AI is increasingly popular for unauthorized collaborations and mashups (including ones involving UMG artists like Drake and Frank Sinatra), but it’s also finding legitimate uses. The surviving members of The Beatles are using AI to create a ‘final’ song from a John Lennon recording, while electronic artist Holly Herndon covered Dolly Parton using an AI voice. UMG itself is exploring AI-made soundscapes. YouTube’s principles could help it profit from legal productions while dodging lawsuits from artists and labels worried about ripoffs.