robots.txt ==>

standard for robot exclusion

<World-Wide Web> A proposal to prevent the havoc wreaked by many early World-Wide Web robots that retrieved documents too rapidly or retrieved documents with side effects (such as casting votes). The proposed standard for robot exclusion addresses these problems with a file called "robots.txt", placed in the document root of a web site, which tells robots which parts of the site they should not retrieve. A compliant robot requests this file before any other document on the site and obeys the exclusions it finds there.
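
The file lists, for each robot or for all robots ("User-agent: *"), URL path prefixes that should not be retrieved. A minimal example (the directory names here are illustrative):

    # Rules for all robots
    User-agent: *
    # Do not retrieve URLs under these paths
    Disallow: /cgi-bin/
    Disallow: /private/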

A de facto standard, widely honoured by web robots, though never adopted as an official standard by the W3C or any other standards body.
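
A minimal sketch of a well-behaved robot in Python, using the standard library's robotparser module to honour the file before fetching a URL (the host name and robot name are illustrative):

    from urllib import robotparser

    # Fetch and parse the site's robots.txt.
    rp = robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()

    # Ask whether this robot may retrieve a given URL.
    if rp.can_fetch("MyRobot", "http://www.example.com/private/index.html"):
        print("allowed to fetch")
    else:
        print("disallowed by robots.txt")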

Last updated: 2006-10-17

