Transformative Use

AI "Reading Books" Is Now Legal: U.S. Court Rules Purchased Books Can Be Used to Train AI Without Authors' Consent
量子位 · 2025-06-26 03:43
Core Viewpoint
- A recent U.S. court ruling allows AI companies like Anthropic to use legally purchased books for training AI without the authors' permission, citing "transformative use" under the fair use doctrine, a reading that favors technological innovation and the public interest [2][3][14].

Group 1: Court Ruling Details
- The decision marks the first judicial recognition of AI companies' right to train on books, significantly reducing the copyright risk associated with AI training data [3].
- The ruling specifies that while training on legally purchased books is permissible, the use of pirated books does not qualify as fair use and remains subject to copyright infringement claims [15][17].
- The case originated from accusations by three authors that Anthropic used both legally purchased and pirated books to train its AI model, Claude [6][13].

Group 2: Background on Anthropic
- Anthropic co-founder Ben Mann downloaded 196,000 copyrighted books from a piracy site in 2021, and the company later amassed at least 5 million copies from other sources [7][8].
- Despite recognizing the legal risks of using pirated content, Anthropic retained all pirated copies; in March 2023 it began training Claude on a subset of books from its digital library [9][10].
- In February 2024, Anthropic shifted to legally procuring and scanning books, purchasing millions of physical copies [11].

Group 3: Implications and Reactions
- The ruling has sparked debate over whether AI training can be equated with human reading and understanding, and over how creators can protect their intellectual property [19].
- Earlier cases such as Google Books and GitHub Copilot set precedents for applying fair use to machine processing of copyrighted works, indicating a judicial trend favoring technological innovation over copyright restrictions [23][32].
- The outcome of this case may influence ongoing litigation involving OpenAI and Meta, as it reflects a judicial inclination towards supporting AI companies in their use of copyrighted materials [34].