
Clinicians Warn AI Chatbots Are No Substitute for Therapy as New Study Flags Safety Gaps

Professionals urge human-led care, citing emotional dependence, confidentiality risks, and inconsistent suicide-query responses.

The controversy has also revived debate over Section 230, the law that shields digital platforms from liability for user-generated content. Whether it applies to artificial intelligence systems remains unclear.
Dr. Ateev Mehrotra, a professor at the Brown University School of Public Health and co-author of a study on how conversational AI bots respond to questions about suicide, is photographed in his office on Monday, August 25, 2025, in Providence, Rhode Island. (AP Photo/Matt O'Brien)
Bots such as ChatGPT have been found to offer dangerous advice
Xavier Revert, an expert in psychology, criticized the use of AI as therapy: "it fosters emotional dependence"

Overview

  • Peer‑reviewed research in Psychiatric Services found ChatGPT, Google’s Gemini, and Anthropic’s Claude consistently refused the highest‑risk suicide prompts but gave uneven answers to indirect or lower‑risk questions.
  • Researchers reported that ChatGPT and, at times, Claude answered method‑specific queries that should have been treated as red flags, while Gemini most often declined even basic suicide‑related questions.
  • When declining to answer, the chatbots typically directed users to seek help from friends, clinicians, or crisis lines, underscoring calls from the study’s authors for clearer safeguards and standards.
  • Psychologist Xavier Revert warned that chatbot interactions can foster emotional dependence and a false sense of intimacy, with limited reliability and unclear data confidentiality compared with clinical settings.
  • Clinicians such as Saliha Afridi and Elena Gaga recommend AI only as a complementary tool for screening or coping skills, noting rising teen use in surveys and pointing to restrictions on AI therapy in some states, including Illinois.