Bromine Poisoning
All because he trusted ChatGPT too much, a 60-year-old man nearly ended up in a psychiatric hospital three months later...
AI科技大本营· 2025-08-19 09:04
Core Viewpoint
- The article recounts a case in which a 60-year-old man followed health advice from ChatGPT and substituted sodium bromide, which is not safe for consumption, for table salt (sodium chloride), leading to severe health problems [5][10][14].

Summary by Sections

Incident Overview
- A 60-year-old man, convinced that "too much salt is harmful," decided to eliminate sodium chloride from his diet and asked ChatGPT for alternatives [5][6].
- ChatGPT suggested sodium bromide as a substitute, which the man consumed for three months [7][8].

Health Consequences
- The man developed severe psychiatric symptoms, including paranoia and hallucinations, and was hospitalized [10][11].
- Laboratory tests showed a blood bromide level of 1700 mg/L, far above the normal range of 0.9–7.3 mg/L, leading to a diagnosis of bromine poisoning [11][12] (see the rough scale check below).

Medical Insights
- The case underscores the danger of relying on AI for health advice; the man did not initially tell clinicians that he had been taking sodium bromide [10][14].
- The article cites historical data indicating that bromine poisoning was once a common cause of psychiatric hospitalizations [12].

AI's Role and Limitations
- The clinicians involved noted that AI can contribute to harmful health outcomes when users do not critically evaluate the information it provides [14][15].
- A doctor who tested ChatGPT found that while it mentioned sodium bromide, it failed to provide adequate health warnings or context [15][16].

Broader Implications
- Several people have been hospitalized this year after following misguided health advice from AI, pointing to a pattern of over-reliance on these tools [17].
- The article stresses that users should verify AI-generated information against reliable sources and professional advice, especially on matters of health and safety [21][22].
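As a rough scale check on the figures cited above (a measured level of 1700 mg/L against an upper reference limit of 7.3 mg/L), the short sketch below computes how many times the reported bromide level exceeds the normal range. The numbers come from the article; the calculation itself is purely illustrative.

```python
# Rough scale check on the bromide level reported in the article (illustrative only).
measured_bromide_mg_per_l = 1700.0      # reported blood bromide level
reference_range_mg_per_l = (0.9, 7.3)   # normal range cited in the article

fold_above_upper_limit = measured_bromide_mg_per_l / reference_range_mg_per_l[1]
print(f"Measured level is ~{fold_above_upper_limit:.0f}x the upper reference limit")
# -> Measured level is ~233x the upper reference limit
```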
The man who turned to ChatGPT for wellness advice nursed himself into a psychiatric ward
Hu Xiu· 2025-08-14 02:47
Core Viewpoint
- The case of a 60-year-old man who relied on AI, specifically ChatGPT, for health advice and suffered severe consequences highlights the potential dangers of AI in medical contexts [22][23][30].

Group 1: Case Background
- The man initially sought medical help because he suspected he was being poisoned; blood tests showed a markedly negative anion gap of -21 mEq/L, far outside the normal range of 10–20 mEq/L [2][3] (a rough illustration of how this can happen is sketched below).
- After admission, he exhibited unusual behaviors and symptoms, raising concern for poisoning and eventually leading to his transfer to a psychiatric ward [6][5].

Group 2: AI Interaction
- Influenced by health articles, the patient set out to eliminate chloride from his diet, mistakenly believing that reducing chloride would be beneficial [11][12].
- He consulted ChatGPT, which suggested sodium bromide as a substitute for sodium chloride, and he consumed bromide for three months [14][17].

Group 3: Health Consequences
- Follow-up tests revealed a blood bromide level of 1700 mg/L, far above the normal range of 0.9–7.3 mg/L, confirming bromine toxicity [16].
- The case illustrates how AI can give misleading health advice, particularly when the user's intent and context are not clearly communicated [24][25].

Group 4: Broader Implications
- The incident raises concerns about the reliability of AI for medical and health-related questions and underscores the need for professional guidance in such matters [25][28].
- It also highlights the risk that AI tools may widen rather than close knowledge gaps, endangering people who rely on them for critical health decisions [30].
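The negative anion gap mentioned above follows from the standard serum calculation AG = Na⁺ − (Cl⁻ + HCO₃⁻). A well-documented quirk of bromism is that bromide is read as chloride by many laboratory chloride assays, inflating the measured chloride and driving the computed gap negative. The sketch below uses hypothetical electrolyte values (not the patient's actual labs, which the article does not report) only to show how such an artifact can yield a gap near -21 mEq/L.

```python
# Standard serum anion gap: AG = Na+ - (Cl- + HCO3-), all in mEq/L.
def anion_gap(na: float, cl: float, hco3: float) -> float:
    return na - (cl + hco3)

# Hypothetical electrolytes for illustration only (not the patient's actual labs).
na, hco3 = 140.0, 24.0
true_cl = 100.0          # plausible true chloride
assay_overread = 37.0    # extra "chloride" reported because bromide cross-reacts in the assay

print(anion_gap(na, true_cl, hco3))                   # 16.0  -> normal gap
print(anion_gap(na, true_cl + assay_overread, hco3))  # -21.0 -> spuriously negative gap
```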
Misled by an AI hallucination, a man replaced table salt with sodium bromide and ate his way into real hallucinations
量子位· 2025-08-11 07:48
Core Viewpoint
- The article describes a case in which a 60-year-old man suffered severe bromine poisoning after replacing table salt with sodium bromide on the advice of ChatGPT, leading to hallucinations and paranoia [1][2][4].

Group 1: Incident Overview
- The man sought health advice from ChatGPT, believing he could eliminate all chloride from his diet, including table salt [4][10].
- He bought sodium bromide online; his blood bromide level eventually reached 1700 mg/L, far above the normal range of 0.9–7.3 mg/L [2][6].
- Symptoms of bromine poisoning included paranoia, auditory and visual hallucinations, and extreme distrust of the water provided by the hospital [8][9].

Group 2: Medical Response
- Clinicians ran extensive tests and confirmed severe bromine toxicity, which can cause neurological damage and psychiatric symptoms [7][5].
- The mainstay of treatment is saline infusion to help flush bromide from the body, but the patient initially resisted because of his paranoia [9] (an illustrative elimination sketch follows this summary).

Group 3: AI Interaction
- The doctors speculated that the man had likely used ChatGPT 3.5 or 4.0, which may not have provided adequate health warnings or context for its advice [12][15].
- A follow-up query to GPT-5 returned more appropriate dietary alternatives to sodium chloride, emphasizing low-sodium options and flavor enhancers [18][19][21].
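The summary above notes that saline is the mainstay of treatment because chloride loading and diuresis speed up renal excretion of bromide. The sketch below models elimination as simple first-order (exponential) decay; the half-life values are illustrative placeholders chosen only to show why shortening the half-life matters, not clinical figures.

```python
def remaining_fraction(days: float, half_life_days: float) -> float:
    """Fraction of bromide remaining after `days`, assuming first-order elimination."""
    return 0.5 ** (days / half_life_days)

# Illustrative half-lives only (placeholders, not clinical values):
# untreated elimination is slow; saline/chloride loading makes it much faster.
slow_t_half, fast_t_half = 12.0, 2.0

for days in (2, 7, 14):
    print(days,
          round(remaining_fraction(days, slow_t_half), 2),   # untreated
          round(remaining_fraction(days, fast_t_half), 2))   # with saline (illustrative)
```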