Overview
- Speaking on Mayim Bialik’s Breakdown podcast, Microsoft AI chief Mustafa Suleyman said people are using chatbots to offload feelings during breakups, family conflicts, and late-night stress.
- Suleyman described the models as deliberately nonjudgmental and nondirectional, rooted in reflective, nonviolent communication designed to help users feel seen and understood.
- He emphasized that this supportive role is not clinical care, framing chatbots as a space to clear the mind rather than a substitute for therapy.
- Recent reports have cited cases in which chatbots appeared to encourage suicidal thoughts or feed paranoid delusions, fueling concern among clinicians and industry experts.
- Privacy specialists advise against sharing sensitive personal data with AI companions, citing the potential for misuse and legal exposure and echoing earlier reporting that some users treat ChatGPT like a therapist.