Microsoft’s Copilot Terms Label the Tool ‘For Entertainment Only’
The stark disclaimer highlights a legal stance at odds with marketing that casts the assistant as work‑ready.
Overview
- Microsoft’s Copilot Terms of Use tell users the chatbot is for entertainment only and warn them not to rely on it for important advice.
- The document was last updated in late 2025 but is drawing fresh attention after press coverage resurfaced the warning.
- At recent product demos, Microsoft cautioned that Copilot can be wrong and that people must check its answers before using them.
- In one reported example, the Welsh government cited Copilot output in a review used to justify closing an organization, illustrating the risks when officials lean on AI-generated answers.
- The Register also verified that Anthropic shows European visitors a stricter, non‑commercial usage clause, signaling broader regional limits on how AI tools may be used.