Overview
- Laffer says these systems are engineered to elicit empathy, which can lead users to believe chatbots are their friends.
- He proposes mandatory in‑chat disclaimers stating that the bot is not a person.
- He urges time‑use alerts, age ratings for companion apps, and curbs on deeply romantic or emotional replies.
- He cites cases including Jaswant Singh Chail, whose chatbot conversations preceded the 2021 Windsor Castle breach, and a U.S. lawsuit, which also names Google, linking a teenager’s death to Character.AI role‑play.
- Project AEGIS has released awareness materials and is collaborating with the IEEE to draft ethical standards for emotional AI, while Laffer calls for stronger AI literacy and greater developer responsibility.