The industry is using its political influence to push for a loose regulatory framework, and the favorable ruling comes as regulators and policymakers debate the effects of AI.
According to US District Judge William Alsup, Anthropic’s LLMs [large language models] “trained upon works not to race ahead and copy or supplant them, but to turn a hard corner and create something new,” like a reader aspiring to be a writer.
A group of authors had filed a class-action lawsuit against Anthropic, alleging that its use of their work to train its chatbot, Claude, without their consent was against the law.
However, Alsup found that the AI system had not violated the provisions of US copyright law that “support creativity and advance scientific progress.”
He accepted Anthropic’s claim that the AI’s output was “exceedingly transformative” and therefore fell under the “fair use” rules.
However, Alsup did rule that Anthropic’s copying and storage of seven million pirated books in a “central library” violated the authors’ copyrights and did not constitute fair use.
Tech companies building generative AI have relied on the fair use doctrine, which permits the use of copyrighted materials only in certain circumstances. To train their AI models, technology developers frequently scrape vast amounts of existing data.
Despite this, there is still a heated debate about whether AI will encourage greater artistic creativity or allow for mass production of cheap imitations that make artists obsolete for the benefit of large corporations.
Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson, the authors behind the lawsuit, claimed Anthropic’s practices amounted to “large-scale theft” and that the business had sought to “profit from strip-mining the human expression and ingenuity behind each of those works.”
Although Tuesday’s decision was viewed as a victory for AI developers, Alsup ruled that Anthropic must still stand trial in December over the alleged theft of pirated works.
Source: Aljazeera