robots.txt Tester
Tests path access against a robots.txt body using the Google specification: the longest matching rule wins, Allow beats Disallow on a tie, and the * and $ wildcards are supported. Useful for checking rules before pushing changes to production.
robots.txt body
Test request
Enter just the path, like /admin/users
Disallowed
- Path: /admin/users
- User agent: GPTBot
- Matched group: gptbot
- Reason: Longest matching rule: disallow /
Apply this with your coding agent
Get a ready-made prompt that wraps the output above with implementation steps. Paste it into your AI assistant and let it ship the change.
How matching works
This follows the rules used by Googlebot and most major crawlers. The user agent is matched against User-agent lines with a case-insensitive substring match; the longest (most specific) matching name wins. Within the matched group, the longest matching path rule wins, with Allow beating Disallow on a tie. The wildcard * and the end-anchor $ are supported.
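The path-rule logic just described can be sketched as follows. This is an assumed, simplified model of the spec (the helper names `_to_regex` and `is_allowed` and the `(directive, pattern)` rule representation are illustrative), not the tester's actual implementation:

```python
import re

def _to_regex(pattern: str) -> str:
    """Translate a robots.txt path pattern into a regex."""
    anchored = pattern.endswith("$")      # trailing $ anchors the end
    if anchored:
        pattern = pattern[:-1]
    # * matches any run of characters; everything else is literal.
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return body + ("$" if anchored else "")

def is_allowed(rules: list[tuple[str, str]], path: str) -> bool:
    """Apply the matched group's (directive, pattern) rules to a path."""
    matching = [
        (directive, pattern)
        for directive, pattern in rules
        if re.match(_to_regex(pattern), path)
    ]
    if not matching:
        return True  # no rule matches -> crawling is allowed
    # Longest pattern wins; on a tie in length, Allow beats Disallow.
    directive, _ = max(matching, key=lambda r: (len(r[1]), r[0] == "allow"))
    return directive == "allow"

# Reproduces the example result above: "disallow /" blocks /admin/users.
assert is_allowed([("disallow", "/")], "/admin/users") is False
```

Note the tie-break in the sort key: `r[0] == "allow"` makes an Allow rule sort above a Disallow rule of equal pattern length, which is exactly the "Allow beats Disallow on a tie" behavior.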
Verify on a real URL
Run a full agent readiness scan to see how your live site responds end to end.