Particle.news

ChatGPT Salt Substitute Advice Leads to Bromide Poisoning Hospitalization

A new case report confirms that AI chatbots can recommend toxic compounds without any toxicity warning.

Overview

  • The patient, hospitalized with severe bromide intoxication, stabilized and was discharged after three weeks of treatment with saline diuresis, electrolyte replacement, and antipsychotics.
  • The Annals of Internal Medicine case report details his intoxication and replicates a ChatGPT 3.5 query that listed sodium bromide as a salt alternative without toxicity warnings.
  • Authors warn that generative AI tools can dispense dangerous medical misinformation and urge users to verify health advice with qualified professionals.
  • Coverage cites a 2025 survey showing 35% of Americans use AI for health guidance and 63% trust its medical advice despite vendor disclaimers.
  • No specific regulatory or platform-level actions have been reported in response to the incident, raising concerns about AI accountability in healthcare.