
4.3.4 Using extreme values -- ``robots''


By using extreme values with the RootNode specification mechanism, it is possible to turn the Gatherer into a Web ``robot''. We implore the user not to do this. Robots are very inefficient: they place excessive load on network links and remote information servers, they do not coordinate gathering effort, and they become less useful over time because they are not focused on a particular topic or community. The Harvest RootNode specification mechanism was designed to support gathering for topical collections, not to build robots.
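For illustration only, the fragment below sketches what such an abusive specification might look like. The host name, limits, and parameter syntax are indicative rather than exact; consult the RootNode examples in the preceding sections for the precise form. The point is simply that enormous enumeration limits remove any topical focus:

      # Gatherer configuration fragment -- do NOT use values like these.
      # Huge URL, Host, and Depth limits make the Gatherer wander far
      # beyond any single collection, i.e., behave as a robot.
      <RootNodes>
      http://www.example.com/  URL=1000000,Host=100000,Depth=100
      </RootNodes>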

  NOTE: As of version 1.4 patchlevel 2, the Gatherer obeys the robots.txt convention.
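For reference, the robots.txt convention lets a server administrator list areas that compliant robots (including the Gatherer) should not retrieve, via a file at the top of the server's document tree. A minimal illustrative file (the paths shown are hypothetical) might look like:

      # /robots.txt on the remote server
      # The rules below apply to all robots.
      User-agent: *
      # Do not retrieve anything under these directories.
      Disallow: /tmp/
      Disallow: /private/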


