Particle.news

Anthropic Sets Sept. 28 Default to Train on Most Consumer Claude Chats

The shift makes training the default for consumer chats, raising questions about meaningful consent.

Overview

  • The change covers Claude Free, Pro, Max, and Claude Code users, with enterprise, government, education, and API access via Amazon Bedrock or Google Vertex excluded.
  • Existing users will be included unless they opt out by September 28, and new users can disable the “Help improve Claude” toggle at sign-up or later in privacy settings.
  • Anthropic will retain data from opted-in accounts for up to five years, replacing the prior 30‑day retention policy.
  • Only new and resumed conversations from opted-in users will be used for training, leaving older chats out of scope.
  • Anthropic cites safety and capability gains from real‑world data, while critics fault the opt‑out design and experts point to FTC warnings about unclear policy changes.