Particle.news

ChatGPT Diet Tip Triggered Severe Bromide Poisoning, Case Study Finds

A Washington state hospital case has prompted experts to warn that AI chatbots lack the clinical context to provide safe, personalized health recommendations.

Overview

  • After consulting ChatGPT, a 60-year-old man spent three months replacing table salt with sodium bromide, leading to a toxic buildup of bromide in his body
  • He developed paranoia, auditory and visual hallucinations, facial acne and cherry angiomas as his blood bromide level climbed to 233 times the healthy limit
  • Admitted in early August, he received IV fluids and electrolytes, stabilized in the psychiatry unit and was discharged after a three-week hospital stay
  • The detailed case report in Annals of Internal Medicine: Clinical Cases marks a rare modern occurrence of bromism, a condition that has been uncommon since over-the-counter bromide salts were banned in 1975
  • The study's authors caution that AI chatbots can offer scientifically inaccurate medical advice without patient-specific warnings or context, underscoring the need for professional medical oversight