Below are the basic robots.txt directives recommended for Shop-Script users. These directives can be used to exclude non-relevant storefront pages from the list of URLs indexed by search engines.
| Directive | Purpose |
| --- | --- |
| Disallow: /search/?query= | Search results page. |
| Disallow: /compare/ | Product comparison page. |
| Disallow: /tag/ | Product filtering by tags. |
| Disallow: *&sort=<br>Disallow: */?sort= | Product sorting results page. |
| Disallow: /cart/ | Shopping cart page. |
| Disallow: /order/ | In-cart checkout page. |
| Disallow: /checkout/ | Multi-step checkout pages. |
| Disallow: /my/ | Customer accounts. |
| Disallow: /login/ | Customer login page. |
| Disallow: /signup/ | Customer signup page. |
| Disallow: /forgotpassword/ | Password recovery page. |
| Disallow: /webasyst/ | Webasyst backend pages. |
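Collected into a single file, the rules above for a store installed at the domain root might look like the minimal sketch below. The User-agent: * line is not part of the table above; it simply applies the group of rules to all crawlers.

```
User-agent: *
Disallow: /search/?query=
Disallow: /compare/
Disallow: /tag/
Disallow: *&sort=
Disallow: */?sort=
Disallow: /cart/
Disallow: /order/
Disallow: /checkout/
Disallow: /my/
Disallow: /login/
Disallow: /signup/
Disallow: /forgotpassword/
Disallow: /webasyst/
```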
The above directives are suitable for online stores available at the domain root, i.e. at URLs like https://yourdomain.com/. If your online store URL contains a subdirectory name, e.g. https://yourdomain.com/shop/, then add that subdirectory name right after the first slash character of each path. For example, the Disallow: /tag/ directive should be changed to Disallow: /shop/tag/, as shown in the sketch below.
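Here /shop/ is only an illustrative subdirectory name; substitute the subdirectory your storefront actually uses. Rules that begin with a wildcard, such as Disallow: *&sort=, match URLs by pattern rather than by path prefix and can usually be left unchanged.

```
User-agent: *
# Paths get the subdirectory prefix:
Disallow: /shop/search/?query=
Disallow: /shop/tag/
Disallow: /shop/cart/
# Wildcard rules stay as-is:
Disallow: *&sort=
Disallow: */?sort=
```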
Crawl-delay directive
The load caused by search crawlers can also be reduced by using the Crawl-delay directive. It sets the minimum interval, in seconds, that a crawler must wait before requesting another page of the website. An example of its use:
Crawl-delay: 10
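The directive is placed inside a User-agent group, next to the Disallow rules. A minimal sketch follows; the 10-second value is only an example, and not every search engine honors Crawl-delay, so check the documentation of the crawlers relevant to your store.

```
User-agent: *
Crawl-delay: 10
Disallow: /cart/
Disallow: /checkout/
```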