Robots.txt SEO Module

The robots.txt module in All in One SEO Pack allows you to set up a robots.txt file for your site that overrides the default robots.txt file generated on your BBWP site. By creating a robots.txt file with All in One SEO Pack, you have greater control over the instructions you give web crawlers about your site.

All in One SEO Pack generates the robots.txt file dynamically, so there is no static file to be found on your server. The content of the robots.txt file is stored in your BBWP site database.
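
You can check what is being served by visiting the standard robots.txt location for your site in a browser (the domain below is a placeholder for your own):

  https://example.com/robots.txt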

Default Rules

The default rules that show in the Create a Robots.txt File box ask robots not to crawl your core BBWP files. It’s unnecessary for search engines to access these files directly because they don’t contain any relevant site content.
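
For reference, these defaults typically look like the example below, which keeps crawlers out of the admin area while still allowing admin-ajax.php; the exact rules shown on your site may differ:

  User-agent: *
  Disallow: /wp-admin/
  Allow: /wp-admin/admin-ajax.php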

Adding Rules

The rule builder is used to add your own custom rules for specific paths on your site. For example, to block all robots from a temp directory, you can use the rule builder to add that rule as shown below.
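
Such a rule would appear in the generated robots.txt as the following pair of lines (the /temp/ path is only an illustration):

  User-agent: *
  Disallow: /temp/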

To add a rule:

  1. Enter the User Agent. Using * will apply the rule to all user agents
  2. Select the rule type to Allow or Block a robot
  3. Enter the directory path, for example /wp-content/plugins/
  4. Click the Add Rule button
  5. The rule will appear in the table and in the box that shows your robots.txt file, as in the example below

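Using the example path from step 3, a Block rule for all user agents would add an entry like this to the generated robots.txt:

  User-agent: *
  Disallow: /wp-content/plugins/
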
NOTE: Although the robots.txt generated by All in One SEO Pack is a dynamic page and not a static text file on your server, you should still take care not to create a very large robots.txt, for two reasons:

1. A large robots.txt indicates a potentially complex set of rules, which can be hard to maintain.

2. Google has proposed a maximum file size of 512KB to alleviate strain on servers from long connection times.