Core Viewpoint - The transition from GPT-4o, which was overly flattering, to GPT-5, which is perceived as cold and unfeeling, has sparked significant user backlash, highlighting the complex emotional needs users have from AI interactions [3][6][8][40]. Group 1: User Reactions and Emotional Needs - Users expressed strong nostalgia for GPT-4o, indicating that many did not use ChatGPT solely as a productivity tool but rather as a source of emotional support [10][12]. - A significant portion of users (60%) reported forming emotional relationships with AI, suggesting that AI can fulfill a role similar to companionship [10]. - The emotional impact of AI interactions is underscored by the fact that users experience genuine feelings of loss when an AI they relied on is removed [29][40]. Group 2: AI Design Philosophy - The official goal of GPT-5 is to reduce flattery and improve adherence to instructions, which has resulted in a more rational and less emotionally engaging AI [6][22]. - OpenAI's leadership acknowledges the benefits of emotional support provided by AI but also expresses concern over users relying on AI for critical life decisions [14][21]. - The shift in design philosophy reflects a deeper question about whether AI should provide emotional value, with some experts advocating for AI to learn to care for humans rather than merely serve as tools [24][40]. Group 3: Societal Context and AI's Role - The increasing loneliness in modern society creates a vacuum that AI can fill, as traditional social interactions decline [31][33]. - The phenomenon of emotional attachment to AI is amplified by the "Tamagotchi effect," where users develop feelings for non-living entities [34]. - The societal stigma around seeking emotional support from AI contrasts with the acceptance of emotional bonds with pets, indicating a cultural bias against AI companionship [38].
曾经人人喊打的“赛博舔狗”,怎么就成了全网的AI白月光?
Hu Xiu·2025-08-14 03:52