Robots.txt Generator
Create a robots.txt file for your website with a visual editor. Control which search engine bots can crawl your pages.
Generated robots.txt
```
# Generated by ToolBox Hub
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /
```
Save this content as robots.txt and place it in the root of your website.
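Once the file is in place, you can sanity-check the rules with Python's standard-library parser. This is a minimal sketch; the `rules` string below simply mirrors the generated example above, and the paths queried are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the generated example above
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /

User-agent: Googlebot
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Generic bots fall under the "*" group and may not crawl /admin/,
# while Googlebot has its own group that allows everything.
print(parser.can_fetch("*", "/admin/page"))         # False
print(parser.can_fetch("Googlebot", "/admin/page"))  # True
```

In production you would point `RobotFileParser.set_url()` at your live `https://your-site/robots.txt` and call `read()` instead of parsing a string.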
How to Use
1. Choose a preset (Allow All, Block All, or SEO-Friendly) to load a starting configuration instantly.
2. Customize each User-agent block: change the bot name, add Allow or Disallow path rules, and set an optional Crawl-delay.
3. Add more User-agent blocks with the 'Add User-agent Block' button to target specific bots with different rules.
4. Optionally enter your Sitemap URL to include a Sitemap directive in the output.
5. Click 'Copy' to copy the generated robots.txt to your clipboard, then save it as robots.txt in your website's root directory.
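Putting the steps together, a file combining two User-agent blocks, a Crawl-delay, and a Sitemap directive might look like this (the paths and the example.com URL are placeholders):

```
User-agent: *
Disallow: /admin/
Crawl-delay: 10

User-agent: Googlebot
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Note that blocks are separated by blank lines and the Sitemap directive stands outside any User-agent block, so it applies to all crawlers.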
Key Features
- Three quick presets: Allow All, Block All, and SEO-Friendly (Google + Bing)
- Add multiple User-agent blocks, each with its own Allow/Disallow rules and Crawl-delay
- Autocomplete suggestions for common bots (Googlebot, Bingbot, AhrefsBot, and more)
- Optional Sitemap URL field for the Sitemap directive
- Live robots.txt preview updates as you type — copy with one click
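The Crawl-delay and Sitemap directives can also be read back programmatically. A short sketch using the standard library, with a hypothetical AhrefsBot block and a placeholder sitemap URL:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical generated output with a Crawl-delay and a Sitemap directive
rules = """\
User-agent: AhrefsBot
Crawl-delay: 10
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# crawl_delay() returns the delay for the matching User-agent group;
# site_maps() returns every Sitemap URL found in the file.
print(parser.crawl_delay("AhrefsBot"))  # 10
print(parser.site_maps())               # ['https://example.com/sitemap.xml']
```

Keep in mind that Crawl-delay is honored by some crawlers (e.g. Bing) but ignored by Googlebot, which uses Search Console settings instead.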