Most site owners don't realize how much of their content large language models (LLMs) already gather. ChatGPT, Claude, and Gemini pull from publicly available pages unless you tell them otherwise. That's where LLMs.txt for SEO comes into the picture.

LLMs.txt gives you a straightforward way to tell AI crawlers how your content can be used. It doesn't change rankings, but it adds a layer of control over model training that wasn't available before.
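To make that concrete, here is a minimal sketch of what an llms.txt file might look like, following the structure described in the llms.txt proposal: a top-level heading with your site's name, a short summary, and sections of links pointing crawlers to the pages you want them to focus on. The site name, summary text, and URLs below are placeholders, not part of any spec.

```
# Example Company

> One-sentence summary of what the site offers and who it's for.

## Documentation
- [Getting started](https://example.com/docs/getting-started): Setup and first steps
- [API reference](https://example.com/docs/api): Endpoints and parameters

## Optional
- [Blog archive](https://example.com/blog): Older posts, lower priority for crawlers
```

The file typically lives at the root of your domain (for example, https://example.com/llms.txt), the same spot where crawlers already look for robots.txt.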