robots.txt Generator with AI Bot Rules
Generate a production-ready robots.txt that includes AI-specific directives and a sitemap reference. Toggle each bot to allow or disallow, configure crawl delay, and copy the result.
Quick presets
Start from a common policy and tweak from there.
Defaults
Crawl delay (seconds). Note that most modern crawlers ignore this directive.
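For the minority of crawlers that do honor it (Bingbot is one; Googlebot ignores the directive outright), Crawl-delay sits inside a user-agent group. A minimal sketch, with the 10-second value picked purely for illustration:

User-agent: Bingbot
Crawl-delay: 10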
AI crawler rules
Click any row to flip its policy.
Generated robots.txt
# Generated by AgentScan robots.txt Generator

User-agent: *
Allow: /

User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: Applebot-Extended
Allow: /

User-agent: Bytespider
Allow: /

User-agent: CCBot
Allow: /

Sitemap: https://example.com/sitemap.xml
Apply this with your coding agent
Get a ready-made prompt that wraps the output above with implementation steps. Paste it into your AI assistant and let it ship the change.
How to deploy this
Save the output above as robots.txt at the root of your domain so it is reachable at https://yourdomain/robots.txt, and serve it with a Content-Type of text/plain.
For the Next.js App Router, create a route handler at app/robots.txt/route.ts that returns the text with content-type: text/plain; charset=utf-8. That keeps the file under version control without needing a static asset.
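A minimal sketch of such a handler, assuming you paste the full generator output into the template literal (trimmed here to two rules for brevity):

// app/robots.txt/route.ts
// Serves the generated rules as plain text so robots.txt stays in
// version control. Replace the body with your full generator output.
const robotsTxt = `# Generated by AgentScan robots.txt Generator

User-agent: *
Allow: /

User-agent: GPTBot
Allow: /

Sitemap: https://example.com/sitemap.xml
`;

export function GET(): Response {
  return new Response(robotsTxt, {
    headers: { "content-type": "text/plain; charset=utf-8" },
  });
}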
Verify on a real URL
Run a full agent readiness scan to see how your live site responds end to end.
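For a quick spot check alongside the full scan, a minimal sketch using the built-in fetch in Node 18+; the URL is a placeholder and the file name check-robots.mjs is only a suggestion:

// check-robots.mjs, run with: node check-robots.mjs
// Confirms robots.txt is reachable and served as plain text.
const res = await fetch("https://example.com/robots.txt");

console.log("status:", res.status); // expect 200
console.log("content-type:", res.headers.get("content-type")); // expect text/plain
console.log((await res.text()).split("\n").slice(0, 5).join("\n")); // first few rules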