Create robots.txt rules with Allow, Disallow and Sitemap lines
Build a clean robots.txt file for your site: choose which crawlers each rule group applies to, add Allow and Disallow paths, and include one or more Sitemap URLs.
01
Target all crawlers with User-agent: * or write rules for specific bots
02
Add multiple Allow and Disallow paths for each crawler group
03
Include Sitemap URLs so crawlers can discover your XML sitemap
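The three steps above combine into a single file. A minimal sketch — the domain and paths are placeholders for illustration:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# Rules for a specific bot
User-agent: Googlebot
Disallow: /drafts/

# One or more sitemap URLs (absolute URLs required)
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/news-sitemap.xml
```

Each User-agent line starts a new rule group, and Sitemap lines apply to the whole file regardless of where they appear.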