Dumps

Documentation for end users of the data dumps is at MetaWikipedia:Data dumps.


Updated notes for Wikimedia site setup, 2006-01-22.

Top-level procedure

Florida

The set of databases is split into four chunks, which can be run independently.

On srv31 as root in a screen session, run:

 # /home/wikipedia/src/backup/backup-pmtpa1 2>&1 | tee some-log-file

On benet as root in a screen session, run:

 # /home/wikipedia/src/backup/backup-pmtpa2 2>&1 | tee some-log-file

On srv31 as root in a screen session, run:

 # /home/wikipedia/src/backup/backup-pmtpa3 2>&1 | tee some-log-file

On benet as root in a screen session, run:

 # /home/wikipedia/src/backup/backup-pmtpa4 2>&1 | tee some-log-file

Files are saved onto benet; make sure there are ~50 gigabytes free before running.
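
One way to check, assuming the dump output lives under a directory such as /var/backup on benet (substitute the actual output path):

 # df -h /var/backup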

User-visible files appear at http://download.wikimedia.org/

Korea

On amaryllis as root in a screen session, run:

 # export WIKIBACKUP=1
 # /usr/local/backup/backup-yaseo 2>&1 | tee some-log-file

Files are saved on amaryllis.

User-visible files appear at http://download-yaseo.wikimedia.org/ (this needs fixing)


Locks and logs

At the moment the new dump script doesn't use lock files, so make sure you don't run two sessions on the same cluster. Lock files will be added so that runs can eventually be started automatically in a reasonably safe fashion.
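
Until then, a minimal wrapper along the following lines could guard against accidental double runs; the lock path is illustrative only and not part of the current scripts:

 LOCK=/var/lock/wikibackup-pmtpa1            # hypothetical lock location
 if ! mkdir "$LOCK" 2>/dev/null; then
     # another run already holds the lock; bail out instead of double-dumping
     echo "dump already running on this cluster (found $LOCK)" >&2
     exit 1
 fi
 trap 'rmdir "$LOCK"' EXIT                   # release the lock when the run ends
 /home/wikipedia/src/backup/backup-pmtpa1 2>&1 | tee some-log-file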

Raw output from the script goes into whatever file you tee it into. A separate text log isn't kept at the moment, but status information is saved into HTML files for public consumption:

  • <base>/
    • index.html - List of all databases and their last-touched status
    • <db>/ - One directory per wiki database
      • <date>/ - Status for each individual dump run

Sites are currently identified by raw database name; a 'friendly' name/hostname could be added later to make searching easier.
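
For instance, a runner could compose the status location for one run roughly like this (variable names and date format are illustrative, not taken from WikiBackup.py):

 base=/path/to/public/dumps       # hypothetical <base> directory
 db=enwiki                        # raw database name of the wiki being dumped
 rundate=$(date +%Y%m%d)          # e.g. 20060419; the actual date format may differ
 echo "$base/$db/$rundate"        # per-run status directory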

Error handling

If a dump step returns an error condition, the runner script should detect this and mark the item as "failed" on the HTML pages. The runner will then continue with the remaining steps and databases, unless the runner script itself fails somehow.
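
In shell terms the intended behaviour is roughly the following; do_dump_step and mark_failed are hypothetical stand-ins for what WikiBackup.py does internally:

 for db in $all_databases; do             # $all_databases: illustrative list of wikis
     if ! do_dump_step "$db"; then        # non-zero exit status means the step failed
         mark_failed "$db"                # shows up as "failed" on the HTML status page
     fi
     # either way, carry on with the remaining steps and databases
 done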

It may be wise to add e-mail or other notification of errors.

Programs used

The dump runner script is available in our CVS, in the 'backup' module, as WikiBackup.py.

  • mysqldump
  • dumpBackup.php and dumpTextPass.php to generate the XML dumps (see the sketch after this list)
    • Requires a working PHP 5 and MediaWiki installation on amaryllis/srv31! Don't remove these hosts from the mediawiki-installation dsh group!
    • Needs the XMLReader PHP extension, with zlib and bzip2 support enabled
    • Uses the ActiveAbstract MediaWiki extension to generate the abstract dumps for Yahoo
    • 7za (from p7zip) must be installed and in the path
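
As a rough illustration of the XML dump step, a full-history dump can be produced along the following lines; the exact flags and output file name vary by MediaWiki version, so treat this as a sketch rather than the production invocation:

 # php maintenance/dumpBackup.php --full 2>>dumpBackup.log | bzip2 > pages-meta-history.xml.bz2

In the real setup the revision text pass is handled separately by dumpTextPass.php.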

Other missing features

Image tarballs are still not being made.

MD5 checksum files aren't being generated.
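
When that is added, it could be as simple as writing one checksum file per dump directory, e.g. (the directory variable, file patterns and output name are suggestions only):

 cd "$dumpdir"                            # the <db>/<date> output directory
 md5sum *.gz *.bz2 *.7z > md5sums.txt     # one checksum line per dump file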

Static HTML dumps might also be folded into this mess in the future.

Notes

Probably not all error detection is working right now: failures of the mysqldump runs are not detected, and tar failures are not detected.

Failures of dumpPages.php should be detected, but only indirectly, via mwdumper failing to parse its XML output.

The mysql dumps and the page XML dump pull from bacon, as configured in one of the higher-level backup scripts. Currently replication on bacon is *NOT* being stopped during the dumps...

  • The page XML dumps should be consistent: all three outputs draw from one input, which comes from one long SQL transaction plus supplementary data loads that should be independent of ongoing changes.
  • The other SQL dumps will not be 100% time-consistent, but that's not too important.

grantswiki and internalwiki are special-cased so they _should_ get completely backed up into /var/backup/private instead of the public dir.
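
The special-casing amounts to sending those two databases to a different destination, roughly like this (variable names are illustrative):

 case "$db" in
     grantswiki|internalwiki) dest=/var/backup/private ;;   # never published
     *)                       dest="$public_dir" ;;         # normal public output
 esac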
