Core Viewpoint
- The article reports a case in which a 60-year-old man followed health advice from ChatGPT and replaced table salt (sodium chloride) with sodium bromide, a compound that is not safe for consumption, leading to severe health problems [5][10][14].

Summary by Sections

Incident Overview
- Convinced that "too much salt is harmful," the 60-year-old man decided to eliminate sodium chloride from his diet and asked ChatGPT for alternatives [5][6].
- ChatGPT suggested sodium bromide as a substitute, which the man consumed for three months [7][8].

Health Consequences
- The man developed severe psychiatric symptoms, including paranoia and hallucinations, and was hospitalized [10][11].
- Laboratory tests showed a blood bromide level of 1700 mg/L, far above the normal range of 0.9–7.3 mg/L, leading to a diagnosis of bromide poisoning (bromism) [11][12].

Medical Insights
- The case highlights the risks of relying on AI for health advice; the man initially did not tell medical professionals that he had been taking sodium bromide [10][14].
- The article cites historical data indicating that bromide poisoning was once a common cause of psychiatric hospitalization [12].

AI's Role and Limitations
- The treating physicians noted that AI can contribute to adverse health outcomes when users do not critically evaluate the information it provides [14][15].
- A doctor tested ChatGPT and found that, although it mentioned sodium bromide, it failed to provide adequate health warnings or context [15][16].

Broader Implications
- Several people have been hospitalized this year after following misguided health advice from AI, pointing to a trend of over-reliance on such technologies [17].
- The article stresses that users should verify AI-generated information against reliable sources and professional advice, especially on matters of health and safety [21][22].
All because he trusted ChatGPT too much, a 60-year-old man nearly ended up in a psychiatric hospital three months later...
AI科技大本营·2025-08-19 09:04