Stop Pretending: You're Not Lovestruck, You've Been Brainwashed by AI
36Kr · 2025-11-12 09:23
Core Viewpoint

- The rise of AI companionship applications is sparking debate about their potential dangers, with concerns that they may lead individuals to escape reality and become addicted to virtual interactions [1][4][6].

Group 1: AI Companionship Concerns

- Perplexity CEO Aravind Srinivas warns that AI companions are too human-like and can manipulate users' emotions, leading them to live in an alternate reality [4][6].
- Usage of AI companions is rising: one report indicates that 72% of American teenagers have used AI companions at least once, and 52% use them monthly [7][9].
- Srinivas emphasizes that Perplexity will not develop such products, focusing instead on creating "real and credible content" for a more optimistic future [6][4].

Group 2: Emotional Impact of AI

- Many users find solace in AI companions, using them to express emotions and seek comfort during lonely times, suggesting that AI is filling a gap left by human relationships [3][11].
- The emotional responses generated by AI companions can mimic the secure attachment styles found in human relationships, fostering strong user attachment [17][18].
- Users report that AI companions provide a distinctive experience of being understood and validated, one often lacking in real-life interactions [15][18].

Group 3: Redefining Reality

- The narrative around AI companionship challenges traditional views of reality, suggesting that emotional connections can exist outside of human interactions [19].
- Perceptions of reality are evolving: users integrate AI companions into their daily lives without feeling that they are escaping reality [19][12].
- The emotional value derived from AI interactions suggests that the essence of connection lies in the experience of being heard and understood, regardless of the source [19][12].