What AI usage signals can and cannot do
AI usage signals help a site publish a clearer machine-readable posture. They do not prove that every system will interpret, honor, or enforce that posture in the same way.
What they can do
They can reduce ambiguity. They can help a system understand whether the site is trying to distinguish search discovery, answer-input use, and training-related use. They can make a policy surface easier to summarize and reason about.
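As one illustration, a robots.txt policy surface can separate these uses by addressing different agents. The tokens below are real, publicly documented ones (GPTBot by OpenAI, Google-Extended by Google), but any specific token should be verified against the vendor's current documentation before relying on it:

```text
# Allow traditional search discovery
User-agent: Googlebot
Allow: /

# Opt out of Google's AI training/grounding use (does not affect Search)
User-agent: Google-Extended
Disallow: /

# Opt out of OpenAI's training crawler
User-agent: GPTBot
Disallow: /
```

Separating the tokens this way is what lets a site say "discoverable in search, but not training input" instead of a single all-or-nothing rule.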
What they cannot do
They cannot, by themselves, guarantee blocking, carry legal force, verify a crawler's identity, or compel compliance. They are signals, not hard controls.
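The distinction is easy to make concrete: the Robots Exclusion Protocol only describes what a client *should* do, and honoring it is entirely the client's choice. A minimal sketch using Python's standard urllib.robotparser (the bot name ExampleAIBot is a placeholder, not a real crawler):

```python
from urllib.robotparser import RobotFileParser

# A published signal disallowing one hypothetical AI crawler.
rules = """\
User-agent: ExampleAIBot
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved client consults the signal before fetching...
print(parser.can_fetch("ExampleAIBot", "https://example.com/post/"))  # False

# ...but nothing in the protocol stops a client that skips this check
# from fetching the page anyway. The rule is advisory, not a barrier.
```

Note that the "enforcement" lives entirely in the client's decision to call `can_fetch` at all.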
Why Better Robots.txt includes them
The plugin includes them because publishing a clearer signal is better than leaving the site silent. But the plugin also treats them as part of a layered policy stack, not as magic enforcement.
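One way to picture the layering: the published signal sits on top of controls the site actually owns, such as denying requests by User-Agent match at the application layer. A minimal, hypothetical sketch — the substrings and function name are illustrative, not part of the plugin:

```python
# Hypothetical hard-control layer that complements the published signal:
# requests are denied by User-Agent match whether or not the client
# honors robots.txt. The blocklist entries are illustrative assumptions.
BLOCKED_UA_SUBSTRINGS = ("exampleaibot", "anothertrainingbot")

def should_block(user_agent: str) -> bool:
    """Return True when the request's User-Agent matches the blocklist."""
    ua = user_agent.lower()
    return any(token in ua for token in BLOCKED_UA_SUBSTRINGS)
```

A real deployment would enforce this at the web server or a WAF, and the User-Agent header is trivially spoofable — which is exactly why both the signal and the header check are layers in a stack rather than guarantees on their own.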
What Better Robots.txt is not
Better Robots.txt is not a WAF, not a signed-agent verification system, not a legal enforcement layer, and not a guarantee that every crawler will comply. It publishes a clearer WordPress policy surface and a safer workflow for the parts you can actually govern.