Anthropic has partially resolved a legal dispute that drew the AI startup the ire of the music industry. In October 2023, a group of music publishers, including Universal Music and ABKCO, filed a copyright infringement lawsuit against Anthropic. The group claimed that the company had trained its Claude AI model on at least 500 songs to which they held the rights, and that, when prompted, Claude could reproduce some or all of the lyrics of those songs. Beyoncé’s “Halo” and Maroon 5’s “Moves Like Jagger” were among the songs whose lyrics the publishers claimed Anthropic infringed.
Under court-approved provisions agreed to by both sides on Thursday, Anthropic will maintain its existing guardrails against output that reproduces, distributes or displays copyrighted material owned by the publishers, and has agreed to apply similar measures when training future AI models.
At the same time, the company said it would respond “quickly” to copyright concerns raised by the group and promised a written response detailing when and how it plans to address those concerns. If it does not intend to address an issue, it must state that intention clearly.
“Claude was not designed to be used for copyright infringement, and we have numerous processes in place designed to prevent such infringement,” an Anthropic spokesperson told Engadget. “Our decision to enter into this stipulation is consistent with those priorities. We look forward to demonstrating that using such material to train models is a typical fair use.”
As noted above, Thursday’s agreement does not fully resolve the original dispute between Anthropic and the music publishers that sued the company. The publishers continue to seek an injunction barring Anthropic from using unauthorized copies of their lyrics to train future AI models. A ruling on that matter is expected in the coming months.