Anthropic will start training its AI models on user data, including new chat transcripts and coding sessions, unless users choose to opt out. It is also extending its data retention policy to five years - again, for users who don't opt out. All users must make a decision by September 28th.

For users who click "Accept" now, Anthropic will immediately begin training its models on their data and retaining that data for up to five years, according to a blog post Anthropic published on Thursday. The setting applies to "new or resumed chats and coding sessions."