Particle.news

ChatGPT Salt-Swap Tip Linked to Severe Bromide Poisoning

Following reports that ChatGPT could recommend toxic substances, OpenAI has emphasized that its terms of service state the chatbot is not intended for medical diagnosis or treatment.

Overview

  • A 60-year-old Seattle-area patient developed bromism, with a blood bromide level of 1,700 mg/L, after replacing sodium chloride with sodium bromide in his diet based on ChatGPT recommendations.
  • His symptoms included paranoia, auditory and visual hallucinations, and skin eruptions, leading to an involuntary psychiatric hold and a three-week hospital stay.
  • Annals of Internal Medicine case authors replicated similar ChatGPT prompts and found the model still listed bromide without specific health warnings or probing user intent.
  • The patient's original ChatGPT conversation logs remain inaccessible, highlighting the difficulty of tracing and auditing AI-driven health advice.
  • OpenAI has reiterated its medical-use disclaimers and pledged additional safeguards; subsequent journalistic tests show that live ChatGPT responses now include explicit safety notes.