Bromide Poisoning
All because he trusted ChatGPT too much, a 60-year-old man nearly landed in a psychiatric hospital three months later...
AI科技大本营· 2025-08-19 09:04
Core Viewpoint
- The article discusses a case in which a 60-year-old man followed health advice from ChatGPT and suffered severe harm after substituting sodium bromide, which is not safe for consumption, for table salt (sodium chloride) [5][10][14].

Summary by Sections

Incident Overview
- A 60-year-old man, influenced by the idea that "too much salt is harmful," decided to eliminate sodium chloride from his diet and asked ChatGPT for alternatives [5][6].
- ChatGPT suggested sodium bromide as a substitute, which the man consumed for three months [7][8].

Health Consequences
- The man developed severe psychiatric symptoms, including paranoia and hallucinations, and was hospitalized [10][11].
- Laboratory tests revealed a blood bromide level of 1700 mg/L, far above the normal range of 0.9–7.3 mg/L (a rough scale comparison follows this summary), leading to a diagnosis of bromide poisoning [11][12].

Medical Insights
- The case highlights the dangers of relying on AI for health advice; the man did not initially disclose his use of sodium bromide to his doctors [10][14].
- The article cites historical data indicating that bromide poisoning was once a common cause of psychiatric hospitalizations [12].

AI's Role and Limitations
- The physicians involved noted that AI can lead to adverse health outcomes when users fail to critically evaluate the information it provides [14][15].
- A doctor tested ChatGPT and found that while it mentioned sodium bromide, it failed to provide adequate health warnings or context [15][16].

Broader Implications
- Several people have been hospitalized this year after following misguided health advice from AI, pointing to a growing over-reliance on such tools [17].
- The article stresses that users should verify AI-generated information against reliable sources and professional advice, especially on matters of health and safety [21][22].
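For a rough sense of scale (a back-of-the-envelope check using only the figures reported above, not a calculation from the article itself): the measured level exceeds the upper limit of the reference range by more than two orders of magnitude.

$$\frac{1700\ \text{mg/L}}{7.3\ \text{mg/L}} \approx 233$$

That is, the patient's blood bromide was roughly 230 times the top of the normal range.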
Misled by an AI hallucination, a man replaced table salt with sodium bromide and ate his way into real hallucinations
量子位· 2025-08-11 07:48
Core Viewpoint
- The article discusses a case in which a 60-year-old man suffered severe bromide poisoning after replacing table salt with sodium bromide on ChatGPT's advice, leading to hallucinations and paranoia [1][2][4].

Group 1: Incident Overview
- The man sought health advice from ChatGPT, believing he could eliminate all chloride from his diet, including table salt [4][10].
- He purchased sodium bromide online; his blood bromide level eventually reached 1700 mg/L, far above the normal range of 0.9–7.3 mg/L [2][6].
- His symptoms of bromide poisoning included paranoia, auditory and visual hallucinations, and extreme distrust of the water provided by the hospital [8][9].

Group 2: Medical Response
- Medical professionals ran extensive tests and confirmed severe bromide toxicity, which can cause neurological damage and psychiatric problems [7][5].
- The preferred treatment for bromide poisoning is saline fluids to flush the bromide out of the body, but the patient initially resisted this because of his paranoia [9].

Group 3: AI Interaction
- The doctors speculated that the man likely used ChatGPT 3.5 or 4.0, which may not have provided adequate health warnings or context for the advice it gave [12][15].
- A follow-up query to GPT-5 produced more appropriate dietary alternatives to sodium chloride, emphasizing low-sodium options and flavor enhancers [18][19][21].