Generate a standard robots.txt, a Blogger-style robots.txt, or a simple sitemap.xml. Fill in your values, click Generate, then Copy or Download.
If the sitemap field is left blank, the generator will propose <siteURL>/sitemap.xml.
Each rule consists of a User-agent line, one or more Allow/Disallow lines (enter multiple paths comma-separated), and an optional Crawl-delay.
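A rule built from those fields might look like the following (the paths and sitemap URL are illustrative placeholders, not values the tool requires):

```
User-agent: *
Disallow: /search
Allow: /
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Note that Crawl-delay is honored by some crawlers (e.g. Bing) but ignored by Googlebot.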
Tip: For Blogger, paste the generated robots.txt into Settings → Crawlers and indexing → Custom robots.txt. For standard sites, place robots.txt in your site root. Upload sitemap.xml to your root (or use your CMS's sitemap URL) and submit it in Google Search Console.
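For reference, a minimal sitemap.xml produced this way follows the standard Sitemaps protocol shape (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```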