Overview
- A 60-year-old man replaced sodium chloride with sodium bromide for three months on ChatGPT’s advice, leading to a toxic buildup of bromide
- He arrived at the emergency department with paranoia and within 24 hours developed auditory and visual hallucinations requiring involuntary psychiatric admission
- Laboratory analysis revealed bromide levels far above the normal range; treatment with intravenous fluids led to a full recovery, and he was discharged after three weeks with no further medication needed
- Bromism was once common in early 20th-century psychiatric wards until the FDA removed bromide from over-the-counter medications, but internet sales of bromide compounds have reintroduced the risk of poisoning
- The case underscores that generative AI can offer harmful medical recommendations when used without clinical context or professional oversight