What is the Robots.txt Generator?

Generate properly formatted robots.txt files to control how search engine crawlers access your website. Add user-agent rules, allow/disallow paths, set crawl delays, and include sitemap references — all without manually writing the file.

How to use

  1. Add user-agent rules (e.g., Googlebot, Bingbot, or * for all) and specify which paths to allow or disallow.
  2. Optionally set crawl delay values and add your sitemap URL.
  3. Copy or download the generated robots.txt file and upload it to your site root.
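The steps above can be sketched in code. This is a minimal illustration of how such a generator might assemble the file, not the tool's actual implementation; the function and parameter names are hypothetical.

```python
# Illustrative sketch of a robots.txt generator (names are hypothetical,
# not the tool's real API).

def build_robots_txt(groups, sitemap=None):
    """Build a robots.txt string from rule groups.

    Each group is a dict with a 'user_agent' plus optional
    'allow', 'disallow' (lists of paths) and 'crawl_delay'.
    """
    lines = []
    for group in groups:
        lines.append(f"User-agent: {group['user_agent']}")
        for path in group.get("allow", []):
            lines.append(f"Allow: {path}")
        for path in group.get("disallow", []):
            lines.append(f"Disallow: {path}")
        if "crawl_delay" in group:
            lines.append(f"Crawl-delay: {group['crawl_delay']}")
        lines.append("")  # blank line separates rule groups
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines)


# Example: allow all crawlers but block /admin/ and /api/.
robots = build_robots_txt(
    [{"user_agent": "*", "disallow": ["/admin/", "/api/"]}],
    sitemap="https://example.com/sitemap.xml",
)
print(robots)
```

Save the resulting text as `robots.txt` at your site root (e.g. `https://example.com/robots.txt`), since crawlers only look for it there.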

Result

Create rules that allow all crawlers on your site but block /admin/ and /api/ paths, with a sitemap at https://example.com/sitemap.xml.
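For the rules described above, the generated file would look something like this (paths omitted from `Disallow` are crawlable by default, so no explicit `Allow` line is needed):

```
User-agent: *
Disallow: /admin/
Disallow: /api/

Sitemap: https://example.com/sitemap.xml
```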
