
How to migrate from a manual robots.txt to Better Robots.txt

Many WordPress sites have a robots.txt that was written by hand — either as a physical file at the web root or pasted into a plugin's text area years ago. Migrating from a manual configuration to Better Robots.txt is straightforward, but it requires verifying that your existing rules are preserved and that the transition does not create a gap in your crawl policy.

Before you start: audit your current file

Open yourdomain.com/robots.txt in your browser and read what it says. Copy the content somewhere safe — this is your reference for the migration. The five-minute audit checklist is a good starting point to identify what your current file does well and what it is missing.

Pay attention to these elements:

Which user agents have specific blocks? If you have rules for Googlebot, GPTBot, or other specific agents, these need to be replicated in the new configuration.

Which paths are disallowed? Make a list of every Disallow rule. Some may be obsolete, but all need to be reviewed.

Is there a Sitemap: directive? Note the URL it points to.

Are there any Allow rules that override broader disallows? These are the most delicate rules to migrate because their specificity matters.
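As a concrete reference, a typical hand-written file containing all four elements might look like this (the paths, agents, and sitemap URL are illustrative, not a recommendation):

```text
# Hypothetical manual robots.txt -- every value here is an example
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

User-agent: GPTBot
Disallow: /

Sitemap: https://yourdomain.com/sitemap.xml
```

Each line maps to an audit item: an agent-specific block (GPTBot), disallowed paths, an Allow rule whose specificity overrides a broader Disallow, and a Sitemap directive.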

Step 1: install and activate the plugin

Install Better Robots.txt from WordPress.org. On activation, the plugin takes control of the virtual robots.txt output. If a physical robots.txt file exists at your web root, however, your web server will typically serve that file directly, before WordPress (and therefore the plugin) is ever invoked — check immediately after activation by visiting /robots.txt.

If a physical file exists, rename it to robots.txt.backup and verify that the plugin's output is now being served.
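If you have shell access, the rename can be scripted. This is a sketch: the default web root path is an assumption, so set WEBROOT to your actual WordPress directory before running it.

```shell
#!/bin/sh
# Sketch: retire a physical robots.txt so the plugin's virtual output is served.
# The default WEBROOT is an assumption -- point it at your real WordPress root.
backup_robots() {
  webroot="$1"
  if [ -f "$webroot/robots.txt" ]; then
    # Keep the original content as a backup; the plugin's output takes over.
    mv "$webroot/robots.txt" "$webroot/robots.txt.backup"
    echo "renamed"
  else
    echo "no physical file"
  fi
}

backup_robots "${WEBROOT:-/var/www/html}"
```

After running it, reload /robots.txt in a browser to confirm the plugin's output is what comes back.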

Step 2: choose a preset as your starting point

Better Robots.txt's presets provide a sensible starting configuration for common site types. Choose the one closest to your situation:

Essential is the safest starting point for most sites. It provides core WordPress protections without aggressive blocking.

AI-First is right if your manual file already contained rules for AI crawlers and you want to maintain or expand that governance.

Fortress is appropriate if your manual file was heavily restrictive — blocking archive bots, SEO tools, and aggressive crawlers.

Custom starts from a minimal base and lets you toggle every rule individually.

Step 3: compare the output against your manual file

This is the critical step. Open the Review & Save screen and compare the generated robots.txt against your saved manual file. Check every rule:

Are all your existing Disallow rules present? If a rule from your manual file is not in the preset, add it through the appropriate module.

Are the same user agents addressed? If your manual file had blocks for GPTBot, ClaudeBot, or others, verify that the AI governance module covers them.

Is the sitemap directive present and pointing to the correct URL?

Are there new rules in the preset that you did not have before? Review each one. Presets include protections for common WordPress crawl waste that manual files often miss.
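The rule-by-rule comparison can be partially automated. The sketch below is a simplified parser (it ties Sitemap lines to the current agent and ignores multi-agent group subtleties, which real robots.txt grammar allows), intended only to surface manual rules that disappeared from the generated output — the file names and rule text are assumptions.

```python
# Sketch: flag rules present in the old manual robots.txt but missing from
# the plugin-generated one. Simplified: does not handle consecutive
# User-agent lines forming a shared group.
def parse_rules(text):
    """Return a set of (user_agent, directive, value) triples."""
    rules = set()
    agent = "*"
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments and whitespace
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            agent = value
        elif field in ("disallow", "allow", "sitemap"):
            rules.add((agent, field, value))
    return rules

def missing_rules(old_text, new_text):
    """Rules in the manual file that are absent from the generated one."""
    return parse_rules(old_text) - parse_rules(new_text)
```

Feed it the saved manual file and the generated output; an empty result means nothing was dropped, while any leftover triples are rules to re-add through the appropriate module.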

Step 4: save and verify

Save the configuration. Visit /robots.txt again and confirm the output matches what you reviewed. Then check Google Search Console's robots.txt tester to verify that no important pages are accidentally blocked.
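Before (or alongside) the Search Console check, you can smoke-test the new rules locally with Python's standard-library robots.txt parser. The rule text and URL list below are illustrative assumptions — paste in your generated file and your own must-stay-crawlable URLs.

```python
from urllib.robotparser import RobotFileParser

# Illustrative generated rules -- replace with your actual /robots.txt content.
generated = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(generated.splitlines())

# Hypothetical list of pages that must remain crawlable after migration.
must_stay_crawlable = [
    "https://yourdomain.com/",
    "https://yourdomain.com/blog/",
    "https://yourdomain.com/wp-admin/admin-ajax.php",
]
for url in must_stay_crawlable:
    verdict = "OK" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict}  {url}")
```

One caveat: urllib.robotparser applies rules in file order (first match wins), while Google resolves Allow/Disallow conflicts by the most specific match — another reason the ordering and specificity of migrated Allow rules deserve care.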

Step 5: monitor for two weeks

After migration, watch Google Search Console's Coverage report for any new "Blocked by robots.txt" errors. If pages that were previously indexed start showing as blocked, review the new rules and adjust.

The first two weeks are the validation window. If no issues appear, the migration is complete.