RapidBot GUI | Adding a rule
A robots.txt file is made of simple rules, each consisting of one or more directives. Each directive allows or disallows access to specific content.
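For instance, a rule groups a User-agent line with its directives. A minimal, hypothetical example:

```text
User-agent: *
Disallow: /private/
Allow: /private/public-page.html
```

Here the rule applies to every bot and contains two directives.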
Add a new rule by clicking the + button at the bottom.
In the right pane, you can specify which bot(s) the rule applies to: * means every bot. You can enter a custom user-agent or choose one from the pre-defined list. A comprehensive list of known bots is available here.
Adding a directive to the rule:
Add a new directive by clicking the + button in the bottom-right corner.
Choose the directive type (Allow or Disallow).
Choose a content filter:
All content means every file on the site, or you can set more specific filters based on file (and folder) names and extensions.
- e.g.: Choose Folder / File and enter the folder or file name in the following field. Pressing the ... button will open the RapidWeaver standard link picker.
- e.g.: If you don't want spiders to crawl your PHP configuration files, choose All content with extension and enter php in the following field. This creates a rule such as Disallow (or Allow): /*.php$.
Enter a text comment that will be included with the robots.txt rule. Bots won't read this information.
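Combining the extension-filter example above with a comment, the generated rule might look like this (the leading # line is the comment, which crawlers ignore):

```text
# Keep crawlers away from PHP configuration files
User-agent: *
Disallow: /*.php$
```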
The - button removes the selected rule.
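To check how a crawler interprets the generated file, you can parse it with Python's standard `urllib.robotparser`. This is a sketch using a simple folder rule; note that `robotparser` does not understand the `*` and `$` wildcards produced by extension filters, so a plain folder path is used here:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt rule equivalent to a Folder / File filter on "private"
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Paths under /private/ are blocked; everything else is allowed
print(rp.can_fetch("*", "http://example.com/private/data.html"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))         # True
```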