Robots.txt

From Wikitech
Revision as of 19:25, 17 September 2008

The rewrite rule

   RewriteRule ^/robots.txt$ /w/robots.php [L]

sends all requests for robots.txt through robots.php. This script checks whether the page Mediawiki:robots.txt exists on the wiki. If it does, the content of that page is sent first, followed by the static file /apache/common/robots.txt.
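The behaviour described above can be sketched as follows. This is a minimal illustration of the concatenation logic, not the actual robots.php source; the helper callables `get_wiki_page` and `read_static_file` are hypothetical stand-ins for fetching the on-wiki page and reading the shared file.

```python
# Sketch of the robots.php dispatch logic described above.
# get_wiki_page and read_static_file are illustrative placeholders,
# not real MediaWiki functions.

STATIC_ROBOTS = "/apache/common/robots.txt"

def serve_robots_txt(get_wiki_page, read_static_file):
    """Build the robots.txt body: on-wiki content first, static file after."""
    parts = []
    # get_wiki_page returns the page text, or None if the page does not exist
    wiki_text = get_wiki_page("Mediawiki:robots.txt")
    if wiki_text is not None:
        parts.append(wiki_text)  # per-wiki rules are sent first
    parts.append(read_static_file(STATIC_ROBOTS))  # shared rules afterwards
    return "\n".join(parts)
```

This ordering lets an individual wiki prepend its own rules without losing the cluster-wide defaults from the static file.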

To edit the static file, do the following:

  • Edit /home/wikipedia/common/robots.txt
  • Run sync-common-file robots.txt