Robots.txt

From Wikitech
Revision as of 23:45, 12 September 2008 by JeLuF (Talk | contribs)


The rewrite rule

   RewriteRule ^/robots.txt$ /w/robots.php [L]

sends all requests for robots.txt through robots.php. This script checks whether the page Mediawiki:robots.txt exists on the wiki. If it does, the content of that page is sent. Otherwise, the static file /apache/common/robots.txt is sent.
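The fallback logic described above can be sketched as follows. This is a minimal Python sketch, not the actual PHP of robots.php; the on-wiki page title and the static-file fallback come from the description above, while the function and parameter names are hypothetical:

```python
from pathlib import Path

def robots_txt(wiki_page_text, static_path):
    """Return the on-wiki override if it exists, else the static fallback.

    wiki_page_text: content of the Mediawiki:robots.txt page, or None if
    that page does not exist (the real script looks this up in the wiki).
    static_path: path of the static fallback file,
    e.g. /apache/common/robots.txt.
    """
    if wiki_page_text is not None:
        # The on-wiki page exists, so its content wins.
        return wiki_page_text
    # No on-wiki page: serve the static file instead.
    return Path(static_path).read_text()
```

The on-wiki page thus acts as a per-wiki override, while the static file provides the shared default.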

To edit the static file, do the following:

  • Edit /home/wikipedia/common/robots.txt
  • Run sync-common-file robots.txt

Note that there is a half-finished robots.php which could be used to generate this file more conveniently; if you feel like finishing it, please make sure you first update it to match the then-current robots.txt file, then update this document. :)
