Examples
Examples should not only show syntax. They should show decisions.
That is the real goal of this section: to help site owners understand why a given robots.txt or governance pattern makes sense for a particular kind of website.
Start here
If you need a broad overview first:
What this section contains
Pattern-based examples
These examples map a site profile to a safer policy starting point (a sketch of the kind of decision they encode follows the list):
- Small Business on Essential
- Publisher on AI-First
- WooCommerce Crawl Control
- Fortress for Sensitive Sites
- Custom Rollout for Agencies
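As a rough illustration of what these patterns differ on, the sketch below separates search crawlers from AI/training crawlers and rules on them as categories. It is not the content of any preset: the user-agent tokens (Googlebot, Bingbot, GPTBot, CCBot) are real crawlers used purely as examples, the sitemap URL is a placeholder, and whether a given category is opened or restricted depends on the pattern you choose. The point is the category split, not the specific allow/disallow choices.

```
# Illustrative sketch only; not the output of any preset.

# Search crawlers: one category, handled as a group.
User-agent: Googlebot
User-agent: Bingbot
Allow: /

# AI / training crawlers: a separate category with its own rule.
User-agent: GPTBot
User-agent: CCBot
Disallow: /

Sitemap: https://example.com/sitemap_index.xml
```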
Robots.txt examples
The canonical examples page is still here:
Use it for line-level examples and familiar file structures.
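For orientation, the familiar structure those line-level examples build on is a single user-agent group followed by path rules and a sitemap declaration. The snippet below is shown only as a structural reference, not a recommendation; the sitemap URL is a placeholder.

```
# A minimal, familiar structure (reference only, not a recommendation):
# one group that applies to every crawler, one exception, one sitemap line.
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap_index.xml
```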
Migration example
If the question is "how do I move from a hand-edited file or another plugin to Better Robots.txt?", start here:
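Whichever migration route applies, a step most migrations share is keeping a copy of what the site currently serves so you can compare it with the generated file afterwards. Below is a minimal sketch of that snapshot step; the URL is a placeholder, and the script is not part of the plugin or the migration guide.

```python
# Minimal snapshot sketch: fetch the robots.txt the site serves today and
# keep it next to the migration notes for a before/after comparison.
import urllib.request

SITE = "https://example.com"  # placeholder: use your own site

with urllib.request.urlopen(f"{SITE}/robots.txt", timeout=10) as response:
    current = response.read().decode("utf-8")

with open("robots-before-migration.txt", "w", encoding="utf-8") as backup:
    backup.write(current)

print(f"Saved {len(current.splitlines())} lines to robots-before-migration.txt")
```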
How to use examples correctly
Use examples to:
- understand trade-offs
- choose a safer starting preset
- reason about crawler categories
- compare site types and publishing goals
Do not use examples to:
- prove a live site’s runtime state
- guarantee crawler obedience
- assume every example applies unchanged to your stack
- infer security or legal force from policy signals alone
Recommended path
- Read the guide.
- Choose the closest use case.
- Select a pattern.
- Compare that pattern against the documented preset.
- Validate with the final review step before publishing; see the spot-check sketch below this list.
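The review step itself is documented with the presets; the sketch below is only an additional spot check you can run yourself. It parses the robots.txt the site actually serves, using Python's standard urllib.robotparser, and asks how it answers for specific crawlers and URLs. The domain, paths, and crawler names are placeholders (GPTBot is a real crawler token, used here only as an example).

```python
# Spot-check sketch: read the live robots.txt and ask how it answers for a
# few concrete crawler/URL pairs before treating the policy as published.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder domain
parser.read()  # fetches and parses the file the site actually serves

checks = [
    ("Googlebot", "https://example.com/blog/sample-post/"),
    ("GPTBot", "https://example.com/blog/sample-post/"),
]

for agent, url in checks:
    verdict = "allowed" if parser.can_fetch(agent, url) else "blocked"
    print(f"{agent}: {verdict} for {url}")
```

Keep in mind that this only shows what the served file permits; it does not prove that any crawler will obey it.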