Crawler Control

Create robots.txt rules with allow, disallow, and sitemap lines

Build a clean robots.txt file for your site, choose which crawlers the rules apply to, add allow and disallow paths, and include one or more sitemap URLs.

01
Target all crawlers with User-agent: * or write rules for specific bots
02
Add multiple allow and disallow paths for each crawler group
03
Include sitemap URLs so crawlers can discover your XML sitemap
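Putting the three steps above together, a generated file might look like this (the paths, bot name, and domain are placeholders, not output from this tool):

```
User-agent: *
Allow: /public/
Disallow: /admin/

User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` line starts a new crawler group, and a blank line separates one group from the next; `Sitemap` lines stand on their own and apply to all crawlers.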

Robots.txt Generator

Create crawler groups, generate the final file, then copy or download it.

Allow paths: one per line (each must start with /)
Disallow paths: one per line; use / to block everything
0 Crawler groups
0 Allow rules
0 Disallow rules
0 Sitemaps

πŸ“„ Generated robots.txt

No file generated yet
Add at least one crawler group, then click Generate.
# Click "Add Group" then "Generate" to create your robots.txt
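The generation step can be sketched in a few lines of Python. This is a minimal illustration of the group-then-sitemap structure described above, not the tool's actual implementation; the `CrawlerGroup` class and `generate_robots_txt` function are hypothetical names.

```python
# Minimal sketch of robots.txt generation from crawler groups.
# A group pairs a User-agent with its Allow/Disallow paths;
# Sitemap lines are appended once at the end.
from dataclasses import dataclass, field

@dataclass
class CrawlerGroup:
    user_agent: str = "*"  # "*" targets all crawlers
    allow: list = field(default_factory=list)
    disallow: list = field(default_factory=list)

def generate_robots_txt(groups, sitemaps=()):
    lines = []
    for group in groups:
        lines.append(f"User-agent: {group.user_agent}")
        for path in group.allow:
            lines.append(f"Allow: {path}")
        for path in group.disallow:
            lines.append(f"Disallow: {path}")
        lines.append("")  # blank line separates crawler groups
    for url in sitemaps:
        lines.append(f"Sitemap: {url}")
    return "\n".join(lines) + "\n"

print(generate_robots_txt(
    [CrawlerGroup("*", allow=["/public/"], disallow=["/admin/"])],
    sitemaps=["https://example.com/sitemap.xml"],
))
```

Note that paths are root-relative while sitemap entries must be full URLs, matching the tips below.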

πŸ“Œ Quick Tips

Best practices

βœ… Use root-relative paths: /admin/

βœ… Sitemap needs full URL: https://example.com/sitemap.xml

βœ… Use / to block all crawling

βœ… Add multiple groups for different bots