Virtual companion AI products
US FTC Requires Seven AI Giants to Explain Their Safeguards for Teenagers
Huan Qiu Wang Zi Xun · 2025-09-13 03:49
Group 1
- The FTC has issued investigation orders to seven AI chatbot companies, including OpenAI, Meta, Snap, xAI, Alphabet, and Character.AI, to assess the impact of "virtual companion" AI products on children and teenagers [1][3]
- The investigation focuses on the companies' profit models, user retention strategies, and measures taken to mitigate user risks; the FTC emphasizes that this is a research project rather than an enforcement action [3]
- The FTC aims to balance protecting children's safety with maintaining the U.S.'s global leadership in AI, and requires the companies to respond within 45 days [3]

Group 2
- Recent reports of youth suicide incidents have heightened public concern about the impact of AI products, prompting the FTC's investigation [4]
- Legislative efforts are underway in several U.S. states, such as California, to establish safety standards for AI chatbots and strengthen corporate accountability for protecting minors [4]
- The FTC has indicated that if the investigation reveals illegal conduct by the companies, it may take further enforcement action to protect vulnerable groups, including teenagers [4]