Xiong Jie: Snowden warns that OpenAI has dropped its disguise…
Guancha.cn · 2025-11-26 01:16

Core Viewpoint

A U.S. federal court has ordered OpenAI to "preserve and isolate all user activity records," raising significant privacy concerns: the ChatGPT conversations of more than 300 million users are now stored indefinitely, even if users attempt to delete them [1]

Group 1: Legal and Privacy Implications

- The court's ruling stems from the copyright lawsuit filed by The New York Times against OpenAI, and suggests that users who delete data may be treated as potential copyright infringers [1]
- OpenAI's CEO, Sam Altman, expressed concern that the decision undermines user privacy and sets a troubling precedent [1]
- The stored conversations may contain sensitive information, including medical symptoms and personal secrets, which could be sold, hacked, or disclosed to law enforcement [1]

Group 2: Corporate Response and Data Security

- Major corporations, including Samsung, Apple, and Amazon, are restricting employee use of ChatGPT for fear of data leaks, with nearly 70% of companies blocking the tool to protect confidential information [1][2]
- The incident in which Samsung employees leaked confidential source code through ChatGPT highlights the risks of using the platform [1]

Group 3: Surveillance Concerns

- Edward Snowden has warned that ChatGPT is a more powerful surveillance tool than the NSA's PRISM program, as it collects comprehensive user data rather than just metadata [5][6]
- OpenAI has acknowledged monitoring user conversations, claiming this is necessary to combat "malicious use," which raises questions about the extent and nature of that surveillance [7][8]
- The criteria for determining what constitutes a threat to others remain ambiguous, leading to concerns about potential misuse of these monitoring capabilities [9]