Particle.news

Anthropic Makes Claude Consumer Chats Opt-Out for Training, Sets Sept. 28 Deadline

The company says using real conversations will improve model safety.

Overview

  • Consumer tiers including Claude Free, Pro, Max, and Claude Code are covered, while enterprise, education, government, and API products are exempt.
  • Unless users opt out, conversations and coding sessions may be used for training and retained for up to five years, replacing the prior 30-day deletion policy.
  • Existing users see a pop-up labeled “Updates to Consumer Terms and Policies” with a prominent Accept button and a default-on training toggle, and inaction counts as consent.
  • Users can change the setting in Privacy Settings for future chats only, and data already used for training cannot be recalled.
  • Anthropic frames the change as improving moderation and model capabilities, while reporting flags the consent design and potential FTC attention; Business Insider notes the timing follows a company report on cybercrime misuse.