Quan Hongchan, Sun Yingsha, and Wang Chuqin Selling Free-Range Eggs? What Liability for Profiting from "AI Voice Theft"?
Huan Qiu Wang Zi Xun · 2025-08-24 01:54
Core Viewpoint
- The misuse of AI voice cloning technology to impersonate Olympic champions for selling products on social media has raised significant concerns among the public and legal experts [1][3][5].

Group 1: Incident Overview
- AI technology has been used to clone the voices of Olympic champions such as Quan Hongchan, Sun Yingsha, and Wang Chuqin to promote agricultural products on short video platforms [1][3].
- A specific case involved a self-media account that published 17 videos using AI to mimic Quan Hongchan's voice, achieving over 11,000 likes on one video, with 47,000 units of the promoted product sold [1][3].

Group 2: Legal Implications
- The unauthorized use of AI to impersonate Olympic champions infringes on their rights, including name rights, voice rights, and portrait rights, as outlined in the Civil Code [6].
- If the impersonation leads to defamation or the sale of counterfeit products, the infringer may face civil liabilities, including compensation for damages and public apologies [6].

Group 3: Consumer Rights
- Consumers misled by AI-generated content can file complaints with the broadcaster, platform, or merchant, and if unresolved, can escalate to consumer associations or legal action [7].
- Evidence such as video recordings of the purchasing process can aid consumers in claiming refunds or compensation, especially if they receive counterfeit products [7].

Group 4: Platform Responsibilities
- Short video platforms must verify the identities of broadcasters and ensure they have authorization to use AI-generated content featuring celebrities [8].
- Platforms are required to implement mechanisms for quickly identifying and removing infringing content; failure to do so may result in shared liability with the infringer [9].