Robots.txt

From Wikitech
Note: there is a half-finished robots.php which could be used to generate this file more conveniently; if you feel like finishing it, please make sure you update it to match the then-current robots.txt file, and then update this document. :)
 

Revision as of 19:08, 17 September 2008

The rewrite rule

   RewriteRule ^/robots.txt$ /w/robots.php [L]

sends all requests for robots.txt through robots.php. The script checks whether the page MediaWiki:Robots.txt exists on the wiki; if it does, the content of that page is served. Otherwise, the static file /apache/common/robots.txt is served.
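The fallback behaviour can be sketched as follows. This is a minimal illustration in Python, not the actual PHP of robots.php, and the function and parameter names are hypothetical:

```python
def robots_txt_body(wiki_page_text, static_path="/apache/common/robots.txt"):
    """Hypothetical sketch of the robots.php fallback logic: prefer the
    on-wiki MediaWiki:Robots.txt page content when it exists and is
    non-empty, otherwise fall back to the shared static file."""
    if wiki_page_text:  # the wiki page exists and has content
        return wiki_page_text
    # No on-wiki override: serve the static file instead.
    with open(static_path) as f:
        return f.read()
```

The point of the check is that an individual wiki can override the cluster-wide static file simply by editing its MediaWiki:Robots.txt page, with no shell access needed.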

To edit the static file, do the following:

  • Edit /home/wikipedia/common/robots.txt
  • Run sync-common-file robots.txt
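The two steps above, as they would look on the deployment host (illustrative only: sync-common-file is the cluster's own tool and will not exist elsewhere):

```shell
$EDITOR /home/wikipedia/common/robots.txt   # 1. edit the shared static file
sync-common-file robots.txt                 # 2. push the change to all Apaches
```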