Features

Better Robots.txt gives you a structured way to decide how different crawler categories should interact with a WordPress site.

Main feature groups

Search engine visibility

Choose how visible your site should be to mainstream search engines and review the impact before publishing.
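As a sketch, a fully permissive policy for mainstream search engines could be expressed like this; Googlebot and Bingbot are the standard crawler tokens, but the plugin's actual generated rules may differ:

```
# Fully visible to mainstream search engines
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /
```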

AI & LLM governance

Express how AI systems may use your content, including answer-generation and training-related preferences.
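For example, opting out of AI answer generation and training is typically expressed with per-bot rules like the sketch below. GPTBot, Google-Extended, and CCBot are real crawler tokens; the exact list the plugin manages is an assumption here:

```
# Illustrative opt-out for common AI crawlers
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```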

SEO tool protection

Control how third-party SEO tools and intelligence crawlers can access your site.
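A sketch of what blocking SEO intelligence crawlers might look like; AhrefsBot and SemrushBot are real tokens used by those tools, though the plugin's own list may differ:

```
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /
```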

Bad bots protection

Block crawlers that are known to be low-value or abusive.
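In robots.txt terms this usually means per-bot disallow rules like the sketch below; the bot name is a placeholder, not a real crawler. Note that genuinely abusive bots often ignore robots.txt, so server-level blocking may still be needed alongside it:

```
# "BadBotExample" is a hypothetical token for illustration
User-agent: BadBotExample
Disallow: /
```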

Archive & Wayback control

Decide whether the site should remain available to public archival services.
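As an illustration, opting out of the Internet Archive has traditionally used the ia_archiver token, though the Wayback Machine's handling of robots.txt has changed over time; whether the plugin emits this exact rule is an assumption:

```
User-agent: ia_archiver
Disallow: /
```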

Spam, feeds, and crawl traps

Reduce noise from low-value paths, feed crawlers, and crawl traps.
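A sketch of the kind of rules that cut this noise; replytocom is a real WordPress query parameter known for creating crawl traps, but the exact paths the plugin covers are assumptions:

```
User-agent: *
Disallow: /*?replytocom=
Disallow: /feed/
Disallow: /*/feed/
```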

WooCommerce cleanup

Avoid wasting crawl budget on cart, checkout, account, and dynamic store URLs.
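For illustration, the default WooCommerce endpoints and dynamic query strings such rules would target look like this; the paths reflect WooCommerce defaults, not necessarily the plugin's exact output:

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /*?add-to-cart=
Disallow: /*?orderby=
```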

Resources & assets

Control how CSS, JS, and image assets are exposed to crawlers.
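Modern search engines render pages, so assets are usually allowed rather than blocked. A minimal sketch, assuming the wildcard and end-anchor syntax supported by major crawlers:

```
User-agent: *
Allow: /*.css$
Allow: /*.js$
Allow: /wp-content/uploads/
```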

Social media crawlers

Decide whether social platforms should generate rich previews when links are shared.
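For example, allowing rich previews typically means leaving the platforms' preview fetchers unblocked; facebookexternalhit and Twitterbot are real tokens, though the full set the plugin handles is an assumption:

```
User-agent: facebookexternalhit
Allow: /

User-agent: Twitterbot
Allow: /
```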

Review & Save

Preview the generated output before you publish it.

Why this matters

The plugin does not try to look “advanced” for its own sake. It is built to make crawl policy clearer, safer, and easier to reason about.
