RapidBot GUI | Adding a rule

robot_GUI.png

Previous: Adding a RapidBot Page

A robots.txt file is made up of simple rules, each consisting of one or more directives. Each directive allows or disallows access to specific content.
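
For reference, a typical rule in the generated robots.txt might look like the following (the path below is a placeholder chosen purely for illustration):

    # Keep every bot out of the example /private/ folder
    User-agent: *
    Disallow: /private/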

Add a new rule by clicking the + button in the bottom left corner.

From the right pane, you can specify:

Which bot(s) the rule applies to.

RapidBot_Spiders.png
* means every bot. You may also set a custom user-agent or choose one from the pre-defined list. A comprehensive list of known bots is available here.

RapidBot_custom_bot.png
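
For example, a rule aimed only at Google's crawler would start with a specific user-agent line instead of the wildcard (Googlebot is used here purely as an illustration):

    # This rule applies only to Google's crawler
    User-agent: Googlebot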

Adding a directive to the rule:

Add a new directive by clicking the + button in the bottom right corner.

Choose the directive type:

RapidBot_Folders_files.png

Choose Allow or Disallow for the content:

RapidBot_Allow_Disallow.png

Choose a content filter:

All content means every file on the site, or you can set more specific filters based on file (and folder) names and extensions.
RapidBot_content_selection.png

  • e.g.: Choose Folder / File and enter the folder or file name in the following field. Clicking the ... button will open the standard RapidWeaver link picker.

Link_Picker.png

  • e.g.: If you do not want spiders to crawl your PHP configuration files, choose All content with extension and enter php in the following field. This will create a Disallow (or Allow) directive such as /*.php$, as shown in the example below.
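
Assuming the Disallow option is selected, the generated lines would look roughly like this:

    # Keep spiders away from PHP files
    User-agent: *
    Disallow: /*.php$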

Comment:

Enter a text comment that will be included with the robots.txt rule. Bots ignore this information.
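
In a robots.txt file, comments are lines starting with #, which crawlers skip; the placement and wording below are illustrative:

    User-agent: *
    # Drafts are not ready to be indexed
    Disallow: /drafts/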

Clicking the - button removes the selected rule or directive.

Next: RapidBot settings