36氪 Selection: AI's 24-Hour Psychological Companionship May Be a "Disaster" | 36氪 Interview
日经中文网·2025-11-15 00:33

Core Viewpoint
- The article examines the evolving relationship between AI and mental health support, asking whether AI can outperform human therapists in providing emotional assistance and what this shift means for the mental health industry [5][8].

Group 1: AI's Role in Mental Health
- AI is increasingly used as a temporary emotional support tool; more than a million ChatGPT users have expressed suicidal tendencies or emotional attachment to the AI [5].
- Compared with traditional therapy, AI offers low-cost, 24/7 access, making emotional support far more accessible [5][6].
- AI could bridge the gap between professional therapy and emotional companionship, creating a new service paradigm [7][8].

Group 2: Development of AI in Therapy
- Simple Psychology, founded by Jian Lili, has integrated AI into its services, initially focusing on enhancing the skills of human therapists rather than commercializing standalone AI products [11][12].
- About 70% of users of the AI assistant launched by Simple Psychology engage with it for emotional support, indicating a clear willingness to turn to AI for mental health needs [16][17].
- Users often perceive AI as more stable and trustworthy than human therapists because AI interactions feel controllable [17][18].

Group 3: Limitations and Future Directions
- While AI performs well in short consultations, it struggles with the long-term engagement required in traditional therapy settings [18][20].
- The traditional structure of therapy, built on fixed session times and a safe space, is challenged by AI's 24/7 availability, requiring a redefinition of how AI-based therapy is structured [20][21].
- Future AI products in mental health are expected to evolve into a new paradigm that complements human therapists rather than replacing them [28][34].

Group 4: Ethical Considerations and Market Dynamics
- The mental health industry faces challenges in integrating AI, including ethical debates and the need for regulatory frameworks to ensure user safety [36].
- There is concern that AI could exacerbate existing problems in mental health care, particularly for vulnerable populations, underscoring the need for careful implementation [36].
- The market for AI in mental health is expected to grow, with a significant share of users likely to prefer AI support over traditional therapy because of its accessibility and lower cost [34][35].