RapidBot GUI | Adding a rule


Previous: Adding a RapidBot Page

A robots.txt file is made up of simple rules, each consisting of one or more directives. Each directive allows or disallows access to specific content.

Add a new rule by clicking the + button in the bottom-left corner.

From the right pane, you'll be able to specify:

Which bot(s) the rule applies to.

* means every bot. You may also set a custom user-agent or choose one from the pre-defined list. A comprehensive list of known bots is available here.
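In the generated robots.txt, the bot selection becomes a User-agent line. A minimal sketch of the two cases (the Googlebot name below is just an illustration of a specific crawler):

```
# Applies to every bot
User-agent: *

# Applies only to Google's crawler
User-agent: Googlebot
```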


Adding a directive to the rule:

Add a new directive by clicking the + button in the bottom-right corner.

Choose the directive type:


Choose Allow or Disallow for the content:


Choose a content filter:

All content means every file on the site, or you can set more specific filters based on file (and folder) names and extensions.

  • e.g.: Choose Folder / File and enter the folder or file name in the following field. Clicking the ... button will open the standard RapidWeaver link picker.


  • e.g.: If you don't want spiders to crawl your PHP configuration files, you can choose All content with extension and enter php in the following field. This will create a directive such as Disallow: /*.php$ (or Allow: /*.php$).
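Put together, the filters above translate into directives like the following in the generated robots.txt (a sketch; the /private/ folder name is just an example):

```
User-agent: *
# Folder / File filter: block a specific folder
Disallow: /private/
# All content with extension filter: block every .php file
Disallow: /*.php$
```

Note that the * wildcard and the $ end-of-URL anchor are supported by the major search engine crawlers, though they are extensions to the original robots.txt convention.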


Enter a text comment that will be included in the robots.txt rule. Bots won't read this information; it is only for human readers of the file.
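Comments appear in robots.txt as lines starting with #, which crawlers ignore. A sketch of how a commented rule might look:

```
# Keep crawlers out of the admin area
User-agent: *
Disallow: /admin/
```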

Clicking the - button removes the selected rule or directive.

Next: RapidBot settings