Particle.news

ChatGPT Salt Swap Case Prompts AI Safety Overhaul

The patient’s full recovery from bromism prompted OpenAI to strengthen ChatGPT’s safety notices and improve auditability


Overview

  • A 60-year-old man consumed sodium bromide for three months after consulting ChatGPT, leading to bromism and a three-week hospital stay.
  • He exhibited paranoia, visual and auditory hallucinations, and metabolic imbalances before doctors diagnosed bromide toxicity through lab tests and a poison-control consultation.
  • Researchers could not retrieve the patient’s original chat logs and discovered that older ChatGPT versions still suggested bromide without explicit health warnings.
  • OpenAI reiterated that ChatGPT is not a substitute for medical advice and pledged further updates, after outlets reported that newer ChatGPT versions now include explicit warnings against toxic suggestions.
  • Experts are calling for integrated clinician oversight, automated risk flags and regulatory standards to prevent AI-generated medical misinformation.