Playbook for SEO leads
Your main question is usually: "Which preset gives me the right crawl behavior without damaging discovery?"
Read first
Your risk model
You care most about:
- crawl waste;
- index quality;
- low-value URL patterns;
- preserving discovery for strategic pages;
- not overselling what robots.txt can do.
Do
- start from a preset, then adjust contextually;
- differentiate between search indexing, AI assistant input, model training, and archival crawling — each maps to different crawlers and directives;
- use WooCommerce-specific rules when the store introduces crawl noise;
- keep runtime validation separate from documentation claims.
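The differentiation in the list above can be made concrete in robots.txt, because each concern maps to a different user-agent token. A minimal sketch follows; the tokens shown (GPTBot, Google-Extended, CCBot) are commonly published examples that should be verified against each vendor's current documentation, and the WooCommerce paths are typical defaults that may differ per store:

```
# Search indexing: keep strategic pages crawlable, block only noise
User-agent: Googlebot
Disallow: /*?add-to-cart=

# Model training (e.g. OpenAI's GPTBot, Google-Extended)
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Archival / corpus crawling (e.g. Common Crawl's CCBot)
User-agent: CCBot
Disallow: /

# WooCommerce crawl noise: cart, checkout, sorted/filtered duplicates
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /*?orderby=
```

Note that tokens and vendor compliance change over time, which is one more reason to start from a preset and adjust contextually rather than copy rules verbatim.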
Don’t
- promise ranking improvements as if they were direct and guaranteed;
- treat policy files as enforcement; robots.txt is advisory, and non-compliant crawlers can ignore it;
- generalize one validated stack into a claim of universal compatibility.
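Keeping runtime validation separate from documentation claims, as the "Do" list advises, can be as simple as parsing the deployed rules and spot-checking representative URLs. A minimal sketch using Python's standard-library parser, with hypothetical WooCommerce-style rules and an example.com domain standing in for a real store:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring a WooCommerce-style preset. In production,
# fetch the live file (RobotFileParser("https://…/robots.txt") + .read())
# instead of trusting what the documentation says is deployed.
rules = """\
User-agent: *
Disallow: /cart/
Disallow: /checkout/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Spot-check representative URLs against the parsed policy.
print(rp.can_fetch("*", "https://example.com/cart/"))            # False
print(rp.can_fetch("*", "https://example.com/products/shirt/"))  # True
```

This checks only what the file permits, not what crawlers actually do — consistent with not treating policy files as enforcement.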