Anthropic, the company behind Claude, is making a significant change to how it handles your data. Starting today, Claude users will be asked to either let Anthropic use their chats to train future AI models or opt out and keep their data private. If you don’t make a choice by September 28, 2025, you’ll lose access to Claude altogether.

Previously, Anthropic had a privacy-first approach, meaning your chats and code were automatically deleted after 30 days unless required for legal reasons. Starting today, however, unless you opt out, your data will be stored for up to five years and fed into training cycles to help Claude get smarter.

The new policy applies to all consumer plans, including Free, Pro, and Max, as well as Claude Code used under those tiers. Business, government, education, and API users are not affected by the change.