Use Cases

Map real website situations to a safer starting point in Better Robots.txt.

The key idea is simple: do not start with the strongest preset. Start with the preset that fits the real site profile, the actual business risk, and the publishing intent.

Small business website

Recommended start: Essential
Why: most small business sites need clarity, proportion, and safe defaults more than aggressive restriction.

Use Essential when you want to (see the sketch after this list):

  • stay discoverable in classic search
  • block obvious low-value WordPress paths
  • avoid over-configuring robots.txt too early
  • preserve social previews and public visibility
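
As a rough illustration, an Essential-style file often amounts to standard WordPress hygiene: keep everything crawlable, fence off the admin area, and point crawlers at the sitemap. The rules below are a minimal sketch under those assumptions, not the preset's literal output, and the sitemap URL is a placeholder.

    # Minimal sketch of an Essential-style robots.txt (assumed rules;
    # the preset's actual output may differ by version and site setup)
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://example.com/sitemap.xml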

Move beyond Essential only if AI governance, archive control, or stricter bot policy becomes a real business requirement.

Pattern reference: Small Business on Essential

Content publisher or editorial site

Recommended start: AI-First
Why: publishers often want to stay visible in search while expressing clearer AI usage boundaries.

AI-First is useful when you want (sketch after the list):

  • stronger separation between search indexing and AI training
  • optional llms.txt support
  • more deliberate crawler governance for content-heavy sites
  • a more explicit public policy toward answer-generation systems
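
A minimal sketch of that posture, assuming a site that wants classic search left open while named AI training crawlers are restricted. The agent list is an example only, not the preset's exact crawler list, and llms.txt (if enabled) is published as a separate file rather than inside robots.txt.

    # Sketch: open to search indexing, restricted for AI training
    # (example agents; not the preset's exact crawler list)
    User-agent: Googlebot
    Allow: /

    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /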

Pattern reference: Publisher on AI-First

WooCommerce store

Recommended start: Essential or AI-First
Why: stores usually need better crawl hygiene more than they need broad, aggressive blocking.

Use Better Robots.txt here to (example after the list):

  • reduce crawl waste on cart, checkout, account, filter, and parameter-heavy URLs
  • keep high-value public pages discoverable
  • avoid duplicate or low-value crawl paths
  • maintain a cleaner balance between openness and restriction
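
A sketch of that crawl hygiene, assuming WooCommerce's default slugs (cart, checkout, my-account); stores with custom permalinks would need different paths, and the * wildcard in paths is honored by major search engines but not by every crawler.

    # Sketch: cut crawl waste on WooCommerce utility and parameter URLs
    # (default slugs assumed; adjust to the store's actual permalinks)
    User-agent: *
    Disallow: /cart/
    Disallow: /checkout/
    Disallow: /my-account/
    Disallow: /*?add-to-cart=
    Disallow: /*?orderby=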

Pattern reference: WooCommerce Crawl Control

Agencies and advanced operators

Recommended start: Custom
Why: agencies often manage multiple client profiles, risk models, and publishing goals.

Custom is best when you already know (see the example below):

  • which crawler categories matter most
  • what should stay open or restricted
  • which policy trade-offs are acceptable for the client
  • which modules should be activated by business model, not by instinct
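
For instance, a custom composition might keep classic search fully open, restrict AI training, and block archiving for one client, while another client gets the opposite trade-offs. The block below is one hypothetical composition, not a template the plugin ships.

    # Sketch: one client-specific composition (hypothetical policy)
    User-agent: Googlebot
    Allow: /

    User-agent: Google-Extended   # Google's AI-training control token
    Disallow: /

    User-agent: ia_archiver       # conventional archive crawler token
    Disallow: /

    User-agent: *
    Disallow: /wp-admin/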

Pattern reference: Custom Rollout for Agencies

Protection-first or archive-sensitive sites

Recommended start: Fortress
Why: some sites care more about archive control, bot exposure, or broad scraping risk than about openness.

Fortress is relevant when (sketch after the list):

  • archive capture is undesirable
  • scraping risk matters more than broad openness
  • a more restrictive starting posture is justified
  • the team can accept the trade-offs of a tighter file
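
As a sketch of that tighter posture: allow the search engines the business still depends on, deny archiving, and deny everything else by default. The trade-off is visible in the final rule, since a blanket Disallow also blocks smaller legitimate crawlers. The exact Fortress rules are the plugin's; this is only an approximation.

    # Sketch: a restrictive, protection-first file (approximation only)
    User-agent: Googlebot
    Allow: /

    User-agent: Bingbot
    Allow: /

    User-agent: ia_archiver
    Disallow: /

    User-agent: *
    Disallow: /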

Pattern reference: Fortress for Sensitive Sites

AI-aware publishing workflows

Recommended start: AI-First
Why: some sites want to distinguish indexing, answer-generation use, and training more explicitly.

This is where Better Robots.txt becomes more than a file editor. It becomes a policy-publishing workflow.
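
In robots.txt terms, that distinction is expressible because several vendors publish separate crawler tokens for search indexing, answer generation, and model training. The pairing below uses publicly documented crawler names, but treating it as AI-First's actual output would be an assumption.

    # Sketch: one file, three intents (documented tokens, example policy)
    User-agent: Googlebot        # classic search indexing
    Allow: /

    User-agent: Google-Extended  # Gemini training and grounding
    Disallow: /

    User-agent: OAI-SearchBot    # ChatGPT search (answer generation)
    Allow: /

    User-agent: GPTBot           # OpenAI model training
    Disallow: /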

Product-intent routes

When a user is not yet asking for a preset, product-intent pages usually match the question better than a preset recommendation.

Decision shortcuts

If your priority is:

  • safe default discovery → start with Essential
  • clearer AI usage boundaries → start with AI-First
  • protection-first posture → start with Fortress
  • agency or expert composition → start with Custom

See also: