FAQ
Is the free edition useful on its own?
Yes. The free edition already gives you guided setup, a useful preset, core controls, and a final preview step.
What it does not do is include every advanced governance control in the free tier.
Do screenshots only show the free edition?
No. Screenshots on this site may show Free, Pro, or Premium capabilities.
They help explain the product, but they do not prove that every screenshotted option exists in every edition.
Can I block AI crawlers?
You can configure AI-related behavior and publish AI usage preferences, but those outputs remain declarative signals, not hard enforcement.
That distinction matters more than ever as some modern agent traffic is user-triggered or verified at the infrastructure layer.
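As an illustration, a robots.txt entry like the following publishes a preference that compliant AI crawlers can honor, but nothing in it technically prevents a fetch. GPTBot and CCBot are used here only as examples of publicly documented AI user-agent tokens; this is a sketch, not a recommended policy:

```
# Declarative signal: well-behaved AI crawlers read and honor this,
# but abusive or unidentified clients can simply ignore it.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```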
Why not treat all crawlers the same?
Because search engines, training crawlers, answer systems, archive services, SEO tools, user-triggered fetchers, and abusive bots do not create the same value or the same risk profile.
A modern policy should separate them before publication.
Is Google-Agent the same thing as Google-Extended?
No.
Google-Extended is about downstream Google reuse for future Gemini training and some other grounding uses. Google-Agent is user-triggered Google agent traffic. They are different surfaces and should not be governed as if they were the same lever.
Does blocking training also block discoverability?
Not necessarily.
In many ecosystems you can keep search visibility while refusing some training-related reuse, as long as the search crawler remains allowed and you block only the training-related surface.
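A minimal sketch of that separation, using Python's standard urllib.robotparser to check a hypothetical policy that keeps Googlebot allowed while disallowing the Google-Extended training surface (the URL and policy are illustrative only):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical policy: refuse the training-related surface (Google-Extended)
# while leaving the search crawler (Googlebot) allowed via the default group.
policy = """\
User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(policy)

# Search crawler stays allowed; training surface is refused.
print(rp.can_fetch("Googlebot", "https://example.com/post/"))        # True
print(rp.can_fetch("Google-Extended", "https://example.com/post/"))  # False
```

The point of the sketch is only that the two surfaces are separate levers: changing the Google-Extended group does not touch what Googlebot may crawl.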
Does llms.txt replace robots.txt?
No.
robots.txt remains the crawl-access layer. llms.txt is a guidance layer that helps machine readers focus on the right content.
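For orientation, the public llms.txt proposal describes a simple markdown file: a title, a short summary, and sections of annotated links. A minimal hypothetical example (all URLs are placeholders):

```markdown
# Example Site

> A short summary that tells machine readers what this site is about.

## Key content

- [Product overview](https://example.com/overview.md): what the product does
- [Pricing](https://example.com/pricing.md): editions and limits
```

Note that nothing here grants or denies crawl access; that remains the job of robots.txt.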
When do I need to move from plugin policy to edge controls?
When the problem involves signed agents, verified bot identity, allowlisting, runtime permissions, WAF rules, or abuse control.
That is infrastructure territory, not the core scope of the plugin.
Is Better Robots.txt a security product?
No. It helps you publish crawl policy and machine-readable governance. It does not replace WAF rules, authentication, or infrastructure controls.
Which preset should I start with?
For most sites, start with Essential.
Move to AI-First, Fortress, or Custom only when the site profile, publishing risk, or business goal genuinely requires it.
Can Better Robots.txt guarantee crawl-budget improvement?
No. It can improve crawl hygiene and reduce low-value paths, but it cannot guarantee crawl-budget or ranking outcomes.
Where do I get support?
Use the Contact page or email support@better-robots.com.
What should I read first if I want the full logic?
Start with: