Using robots.txt to reduce licensed page views


Webtrends Analytics 8.x
Webtrends Analytics 9.x


When robots and spiders index a web site, their requests are recorded in the server logs, which increases the number of licensed page views consumed when analyses are run in Webtrends.


Clients who do not need this traffic analyzed can restrict indexing of the site by these processes. The robots.txt file is a plain text file that instructs compliant robots and spiders not to index the website.

To create a robots.txt file, open a text editor and add the following lines:

# go away
User-agent: *
Disallow: /

Save the file as robots.txt and place it in the root directory of your web site, so it is served at the path /robots.txt.
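To sanity-check the rules before deploying them, you can parse the file with Python's standard-library robotparser module. This is an optional verification step, not part of Webtrends itself; the user agent names below are only examples.

```python
from urllib.robotparser import RobotFileParser

# The same rules shown in the example robots.txt above.
rules = """\
# go away
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# With "User-agent: *" and "Disallow: /", no path on the site
# is crawlable by any compliant robot.
print(parser.can_fetch("Googlebot", "/index.html"))  # False
print(parser.can_fetch("SomeOtherBot", "/"))         # False
```

Both calls return False, confirming that the directives block all compliant crawlers from every path on the site.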

More Information

Please refer to the following URL for a more detailed explanation of how robots.txt works: