Core Viewpoint
- The Shanxi Consumer Association has issued a warning about new scams that use "AI face-swapping" and "AI voice synthesis" technologies, collectively known as deepfakes ("deep forgery"), which threaten consumers' financial security and personal privacy as AI technology becomes increasingly widespread in 2025 [1].

Group 1: Scam Methods
- Scammers collect clear facial videos and voice clips of consumers or their acquaintances from social media platforms such as Douyin, WeChat, and Weibo to train AI models [1].
- Using deepfake technology, they create realistic fake videos or audio and stage urgent scenarios, such as claiming an accident or detention, to lower the victim's guard [1].
- Scammers then contact victims through video calls or send forged audio and video clips, requesting transfers to specified accounts [1].

Group 2: Consumer Protection Measures
- The Shanxi Consumer Association stresses adopting a "multi-verification" mindset and not relying solely on what is seen or heard [2].
- Consumers are advised to follow the "three no's and two musts" prevention principles: do not trust, do not transfer money, and do not disclose personal information [2].
- Any transfer request made through non-face-to-face channels must be verified, even when it appears to come from a familiar voice or face, and consumers should be wary of manufactured urgency that discourages verification [2].

Group 3: Verification and Reporting
- When "friends or family" request money, consumers should hang up and call back using stored contact information, or confirm through mutual acquaintances [3].
- Watching for subtle signs of AI-generated content, such as unnatural facial expressions or audio discrepancies, can help identify scams [3].
- In case of suspected fraud, consumers are urged to report to the police immediately and provide evidence such as account details, contact information, chat records, and transfer receipts [3].
Shanxi Provincial Consumer Association Issues Warning: Beware of New "AI Face-Swapping" and "AI Voice Synthesis" Scams
China News Network (Zhongguo Xinwen Wang)·2025-12-12 03:08