ToolPal

Robots.txt Generator

Create a robots.txt file for your website with a visual editor and control which pages search engine crawlers may access.

Quick Presets


Generated robots.txt

# Generated by ToolBox Hub

User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

Save this content as a robots.txt file and place it in your website's root directory.
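Once the file is in place, you can sanity-check the rules with Python's standard-library parser. A minimal sketch, using the example rules shown above; to test a live site, swap `parse()` for `set_url(...)` followed by `read()`:

```python
from urllib.robotparser import RobotFileParser

# These rules mirror the generated example above. parse() takes an
# iterable of lines, so no network request is needed for a local check.
RULES = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(RULES.splitlines())

# Paths under /admin/ are blocked for all agents; everything else is allowed.
print(rp.can_fetch("*", "/admin/settings.html"))  # False
print(rp.can_fetch("*", "/blog/post.html"))       # True
```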

How to Use


  1. Choose a preset (Allow All, Block All, or SEO-Friendly) to load a starting configuration instantly.
  2. Customize each User-agent block: change the bot name, add Allow or Disallow path rules, and set an optional Crawl-delay.
  3. Add more User-agent blocks with the 'Add User-agent Block' button to target specific bots with different rules.
  4. Optionally enter your Sitemap URL to include a Sitemap directive in the output.
  5. Click 'Copy' to copy the generated robots.txt to your clipboard, then save it as robots.txt in your website's root directory.
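The assembly the steps describe can be sketched roughly as follows. The block and field names (`agent`, `allow`, `disallow`, `crawl_delay`) are illustrative, not ToolPal's actual internals:

```python
# Hypothetical sketch of assembling User-agent blocks into robots.txt text.
def build_robots_txt(blocks, sitemap=None):
    lines = []
    for b in blocks:
        lines.append(f"User-agent: {b['agent']}")
        for path in b.get("disallow", []):
            lines.append(f"Disallow: {path}")
        for path in b.get("allow", []):
            lines.append(f"Allow: {path}")
        if b.get("crawl_delay"):
            lines.append(f"Crawl-delay: {b['crawl_delay']}")
        lines.append("")  # blank line separates blocks
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines).rstrip() + "\n"

print(build_robots_txt(
    [{"agent": "*", "disallow": ["/admin/"], "allow": ["/"]}],
    sitemap="https://example.com/sitemap.xml",
))
# User-agent: *
# Disallow: /admin/
# Allow: /
#
# Sitemap: https://example.com/sitemap.xml
```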

Key Features

  • Three quick presets: Allow All, Block All, and SEO-Friendly (Google + Bing)
  • Add multiple User-agent blocks, each with its own Allow/Disallow rules and Crawl-delay
  • Autocomplete suggestions for common bots (Googlebot, Bingbot, AhrefsBot, and more)
  • Optional Sitemap URL field for the Sitemap directive
  • Live robots.txt preview updates as you type — copy with one click
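A file exercising several of these features at once might look like the following sketch (the domain, paths, and bot choice are illustrative):

```
User-agent: *
Disallow: /admin/
Crawl-delay: 10

User-agent: AhrefsBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```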
