Anthropic will soon begin using your chat transcripts to train its popular chatbot, Claude. The announcement came on Thursday as an update to the company's Consumer Terms and Privacy Policy.
New users will see an option to "Help improve Claude" that can be toggled on or off as part of the sign-up flow, while existing users will begin to see a notification explaining the change. Users have until Sept. 28 to opt out, as the setting will be enabled by default. You can still turn the option off later in Claude's privacy settings.
A representative for Anthropic declined to comment.
What's changing?
Individual users on Claude Free, Pro or Max plans, including Claude Code sessions under those plans, will be affected when the change goes into effect in late September. With the option enabled, AI training will take place only on new and resumed chat and coding sessions. Older chats that aren't revisited won't be affected -- at least, for now.
There are exceptions to the updated policy. Claude for Work (Team and Enterprise plans), Claude Gov and Claude Education will not be affected. The same goes for third-party use of Claude through the API, including via Amazon Bedrock and Google Cloud's Vertex AI.
Until the deadline on Sept. 28, users can ignore the notification by closing it or choosing the "not now" option. After the deadline, users must make a choice in order to continue using Claude.
Read more: Anthropic's Claude Extension Bakes AI Right Into the Chrome Browser
Data retention changes
Users who opt in to AI model training will also have their data held significantly longer than the previous 30-day window. Opted-in users will have their data stored for five years, and, like the updated training policy, this will apply only to new or resumed chats.
Anthropic says the extended data retention will allow the company to identify misuse and detect harmful usage patterns.
How to update your privacy settings in Claude
If you opted in to the new changes by accident or have changed your mind for any reason, you can always opt out. Here's how.
From the web:
Click your user icon on the bottom left
Click Settings
Click Privacy from the side panel
Toggle Help improve Claude to off
From mobile:
Tap the stacked lines icon at the top left
Tap the Settings cog
Tap Privacy
Toggle Help improve Claude to off
If you previously opted in and decide to opt out, your new and resumed chats will no longer be used for future AI model training. Any qualifying data will remain part of training runs that have already started and models that have already been trained, but it won't be used in any future training.