Compress old revisions

From Wikitech
Revision as of 08:36, 5 November 2004 by Jamesday (Talk)


There is a script to compress individual old revisions:

  • cd /home/wikipedia/common/php-new/maintenance
  • nice php compressOld.php en wikipedia -t 1 -c 100 5467442
  • -t 1: the time to sleep between batches, in seconds
  • -c 100: the number of old records per batch
  • 5467442: the old_id to start at (usually 1 for a fresh run). The current old_id is displayed as the script runs; if you stop the job, note the last value reached and use it to resume later. Every record that has already been converted produces a warning, so don't start much below the point you need.
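The batch, sleep, and resume behaviour described above can be sketched as follows (a hypothetical Python illustration of the pattern only; the real logic lives in compressOld.php, which compresses old-revision rows in the database):

```python
import time

def compress_batches(record_ids, start_id=1, batch_size=100, sleep_secs=0):
    """Illustrative stand-in for compressOld.php's main loop.

    Processes record ids in batches of `batch_size` (-c), sleeping
    `sleep_secs` (-t) between batches, starting at `start_id`.
    Returns the last id processed, which can be passed back as
    `start_id` to resume an interrupted run.
    """
    last_id = start_id - 1
    pending = [r for r in record_ids if r >= start_id]
    for i in range(0, len(pending), batch_size):
        batch = pending[i:i + batch_size]
        # ... compress each record in `batch` here ...
        last_id = batch[-1]
        print(f"processed up to old_id {last_id}")  # note this to resume later
        time.sleep(sleep_secs)  # -t: throttle to limit database load
    return last_id
```

Starting at a `start_id` below the point already reached is harmless but noisy, which is why the note above says not to start much below the point you need.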

Progress for en wikipedia:

  • As far as 5467442 of about 6.7 million
  • Resume off-peak with nice php compressOld.php en wikipedia -t 1 -c 100 5467442
  • A batch size of 5000 is OK off-peak

Completed. Left about 40GB lost to fragmentation. Freeing it will take a table rebuild, but that can't be done on Ariel with an InnoDB table because the rebuild's copy would add another 40GB to the tablespace.
