Beware! "AI-Synthesized Voice" Becomes a New Weapon in Online Dating Fraud; Shanghai Man Defrauded of Over 50,000 Yuan
Xin Lang Cai Jing·2026-01-13 06:22

Core Insights
- The article reports an online dating fraud case in which a man used AI voice synthesis to impersonate a woman and defraud a victim of more than 50,000 yuan [1][2].

Group 1: Fraud Case Overview
- The fraudster, surnamed Zhai, used an AI-generated female voice and a fictitious identity to build a romantic relationship with the victim, Mr. Yao, and then deceived him financially [1].
- Mr. Yao reported the fraud to the police on May 19, 2025, after being manipulated into sending money under various pretexts, including financial hardship and vehicle-related expenses [1].
- The police investigation led to Zhai's arrest on June 26, 2025, after evidence of his fraudulent activities was gathered [1].

Group 2: Legal Proceedings
- During the investigation, the prosecution guided evidence collection and fact verification, confirming that the total amount defrauded exceeded 50,000 yuan [1][2].
- Zhai initially denied the charges, claiming the account had been used by someone else, but confessed after being confronted with the evidence, including phone data and voice-synthesis files [2].
- On September 19, 2025, the prosecution charged Zhai with fraud; he was sentenced to one year and eight months in prison and fined 20,000 yuan [2].

Group 3: Industry Implications
- The case highlights the emerging threat of AI-assisted fraud in online interactions and underscores the need for vigilance in online dating [2].
- Authorities recommend verifying identities through video calls and being wary of contacts who repeatedly request money while avoiding in-person meetings [2].