RobotHost
Defines a list of hostnames that will be recognised as crawler robots (search engine spiders).
This directive is only available for use in the global (interchange.cfg) configuration file, and will affect all websites running under the Interchange installation. It will not work in a website's local (catalog.cfg) configuration file.
Requests coming from any of the listed hostnames will cause Interchange to alter its behaviour to improve the chance of Interchange-served content being crawled and indexed.
This directive accepts a wildcard list: the "*" character represents any number of characters, and the "?" character represents any single character. For example, "208.146.26.*" would match "208.146.26.0" through "208.146.26.255".
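As a sketch, a RobotHost entry in interchange.cfg might look like the following. The hostname patterns shown are placeholders rather than a recommended list of crawlers, and the comma-separated form assumes the usual layout of an Interchange wildcard list.

    # Hypothetical example: these patterns are illustrative only,
    # not a complete or recommended list of crawler hosts.
    RobotHost    *.googlebot.com, *.crawl.example.net, 208.146.26.*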
If a client is recognised as a robot, Interchange will do the following: