Settings Overview

The current Better Robots.txt workflow is organized as a guided sequence rather than a flat list of toggles.

Main steps

  • Preset selection
  • Search engine visibility
  • AI & LLM governance
  • SEO tool protection
  • Bad bots protection
  • Archive & Wayback control
  • Spam, feeds, and crawl traps
  • E-commerce optimization
  • Resources & assets
  • Social media crawlers
  • Ads & revenue
  • llms.txt
  • Advanced settings
  • Review & Save
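
As a rough sketch of where these steps lead, a policy assembled through the sequence might resolve to a robots.txt like the one below. This is illustrative only: the actual directives depend entirely on the presets and toggles chosen, and the mapping of each step to a specific rule here is an assumption. GPTBot, AhrefsBot, and ia_archiver are real crawler user-agent tokens used as examples.

```text
# Illustrative output — actual rules depend on the options selected.

# AI & LLM governance: block an AI crawler (example)
User-agent: GPTBot
Disallow: /

# SEO tool protection: block a backlink-analysis crawler (example)
User-agent: AhrefsBot
Disallow: /

# Archive & Wayback control: block the Internet Archive crawler (example)
User-agent: ia_archiver
Disallow: /

# Search engine visibility: allow all remaining crawlers
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

Each guided step contributes a small, labeled group of directives like these, which is what makes the final file reviewable in the Review & Save step.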

Why this matters

A robots.txt policy is easier to build and audit when users can reason through one concern at a time instead of hand-writing every directive in a single pass.