Patient Develops Severe Bromism After Following ChatGPT’s Salt Substitute Advice

Researchers found ChatGPT still recommends sodium bromide as a salt substitute without health warnings, prompting OpenAI to vow stronger model safeguards.

Image: Should you rely on ChatGPT for health advice? This case from New York may change your mind.

Overview

  • A case published in the Annals of Internal Medicine describes a 60-year-old man who consumed sodium bromide for three months after ChatGPT suggested it as a replacement for table salt; he subsequently developed hallucinations, paranoia and dermatologic symptoms.
  • Clinical tests measured his serum bromide concentration at about 1,700 mg/L, more than 170 times the normal level, resulting in a three-week hospital stay that included involuntary psychiatric observation.
  • Lacking the patient's original chat log, University of Washington researchers re-ran the dietary query themselves and confirmed that ChatGPT again listed bromide as a chloride substitute without issuing a specific medical warning.
  • The incident highlights the difficulty of auditing AI outputs, the risk of decontextualized medical advice and the need for clinician involvement when interpreting AI health recommendations.
  • OpenAI has pointed users to its terms of use, which caution against relying on ChatGPT for medical decisions, and has pledged enhanced safety measures in upcoming model releases.