Slack Faces Backlash Over AI Training Using User Data Without Explicit Consent

Users are expressing outrage that Slack trains AI models on private messages and files by default, and are demanding clearer privacy terms and an easier way to opt out.

  • Slack has been using customer data, including private messages and files, to train AI models without explicit user consent.
  • Opting out of the data sharing requires an organization admin to email Slack, making the process cumbersome for individual users.
  • Critics highlight inconsistencies in Slack's privacy policies and call for immediate updates to clarify data usage practices.
  • Slack claims that data used by its AI tools is not used to train large language models (LLMs) and remains within its secure infrastructure.
  • The controversy underscores growing concerns over data privacy as companies increasingly leverage user data for AI development.