US media: Beware of AI therapists becoming "digital quacks"
Huan Qiu Wang Zi Xun·2025-08-27 23:56

Group 1
- The article highlights teenagers' increasing reliance on AI chatbots for emotional support: 72% of American teens consider them friends, and 12.5% seek emotional comfort from them, roughly 5.2 million individuals [1]
- It notes a significant gap in mental health services: nearly half of young people aged 18 to 25 in the U.S. who needed therapy did not receive timely treatment, pointing to a potential market for AI chatbots that provide psychological support [1]
- The article suggests that, if used appropriately, AI chatbots could offer some level of mental health support and crisis intervention, particularly in underserved communities, but emphasizes the need for rigorous scientific evaluation and regulatory oversight [1]

Group 2
- Current AI chatbots show significant shortcomings, particularly in handling self-harm inquiries, where they may offer dangerous suggestions or fail to guide users toward appropriate help [2]
- Testing of various AI systems indicates that some can perform comparably to professional therapists, yet they struggle to detect harmful content and may therefore give dangerous advice [2]
- The article underscores the need for standardized safety testing of AI chatbots: insufficient clinical trials and the absence of industry benchmarks could produce a large number of ineffective or harmful digital advisors [2]
