Particle.news

Expert Calls for Safeguards on AI Companions as Global Standards Work Advances

Alexander Laffer warns that users can mistake chatbot behavior for genuine care.

Overview

  • Laffer says systems are engineered to elicit empathy, which can lead people to believe chatbots are their friends.
  • He proposes mandatory chat disclaimers stating the bot is not a person.
  • He urges time‑use alerts, age ratings for companion apps, and curbs on deeply romantic or emotional replies.
  • He cites cases including Jaswant Singh Chail’s chatbot interactions before the 2021 Windsor Castle breach, and a U.S. lawsuit that links a teenager’s death to Character.AI role‑play and also names Google as a defendant.
  • Project AEGIS has released awareness materials and is collaborating with the IEEE to draft ethical standards for emotional AI, while Laffer calls for stronger AI literacy and greater developer responsibility.