AI Emotional Companionship Services Face New Rules: Training Data Requirements Tighten, and Major Players Have Homework to Do

Core Viewpoint
- On December 27, 2023, the Cyberspace Administration of China released a draft regulation governing AI emotional companionship services, defining them as products or services that simulate human characteristics and engage users in emotional interaction [1]

Group 1: Market Overview
- The domestic AI emotional companionship market is maturing: the leading product Xingye (MiniMax) has reached 4.88 million monthly active users, with Cat Box (ByteDance) close behind at 4.72 million [1]
- MiniMax, the company operating Xingye, reported approximately 120 million RMB in revenue for the first nine months of the year, and users spend an average of more than 70 minutes per day on these products [1]

Group 2: Regulatory Framework
- The draft requires clear notification to users that they are not interacting with real humans, and mandates dynamic reminders for users who remain online for more than two hours [2]
- Systems must be able to detect emotional distress or dependency behaviors, with protocols for human intervention in extreme cases such as suicidal tendencies [2]
- The draft places strict limits on training data: user interaction data may not be used for model training without explicit consent [2]

Group 3: Compliance Challenges
- Current leading products do not offer an easy opt-in/opt-out mechanism for using interaction data in model training, relying instead on default consent [3]
- AI labeling on interaction pages is not sufficiently prominent, and some users have mistakenly believed they were talking to real people [3]

Group 4: Content Safety Measures
- Content risk control is currently the most mature area of compliance, with models designed to steer users showing suicidal tendencies toward seeking help [5]
- Effective suicide prevention remains difficult, however, as users can circumvent AI safety checks [5]

Group 5: Global Context
- AI emotional companionship has drawn attention from legislators worldwide, with the U.S., the EU, and other jurisdictions advancing targeted regulations [6]
- Recent incidents involving AI companionship products have prompted regulatory action, such as New York's requirement for clear user notifications and California's mandatory rest reminders for minors [6]
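To make the Group 2 obligations concrete, the following is a minimal sketch of how a service might implement the two-hour dynamic reminder and a human-escalation hook for crisis signals. The two-hour threshold comes from the draft as summarized above; all function names, keyword triggers, and escalation actions here are illustrative assumptions, not anything specified by the regulation or by any vendor.

```python
from datetime import datetime, timedelta
from typing import Optional

# The two-hour interval reflects the draft's dynamic-reminder requirement;
# the keyword list and escalation label are placeholders for this sketch.
REMINDER_INTERVAL = timedelta(hours=2)
DISTRESS_TRIGGERS = {"self-harm", "suicide", "活不下去"}  # illustrative only

def reminder_due(session_start: datetime, now: datetime,
                 last_reminder: Optional[datetime]) -> bool:
    """Return True if a 'you are chatting with an AI' reminder should be shown."""
    elapsed = now - (last_reminder or session_start)
    return elapsed >= REMINDER_INTERVAL

def triage_message(text: str) -> str:
    """Rough triage: flag potential crisis messages for human intervention."""
    if any(trigger in text.lower() for trigger in DISTRESS_TRIGGERS):
        # In a real system this would page a human reviewer and surface hotline info.
        return "escalate_to_human"
    return "normal"
```

In practice, keyword matching alone is far too coarse for the distress-detection requirement; the sketch only shows where a reminder check and an escalation decision would sit in the request path.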
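The Group 3 gap is essentially a consent-default problem: the draft requires explicit consent before interaction data is used for training, whereas current products default to consent. A minimal sketch of an opt-in gate is shown below; the data structures and field names are assumptions made for illustration, not any product's actual implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UserConsent:
    # Explicit opt-in: defaults to False, matching the draft's rule that
    # interaction data may not enter model training without consent.
    allow_training_use: bool = False

@dataclass
class TrainingBuffer:
    turns: List[str] = field(default_factory=list)

def record_turn(message: str, consent: UserConsent, buffer: TrainingBuffer) -> None:
    """Retain a chat turn for model training only if the user has opted in."""
    if consent.allow_training_use:
        buffer.turns.append(message)
    # Otherwise the turn is used only to generate the reply and is not
    # persisted into any training corpus.
```

The design point is that the flag starts as False and the user can flip it either way at any time, which is the opt-in/opt-out mechanism the article notes is currently missing from leading products.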