Make a database dump

From Wikitech
Revision as of 21:41, 28 November 2005 by Server inventory (Talk | contribs)


This explains how to make a wiki dump, which is not the same thing as a full database backup. The XML dumps will then be available at http://download.wikimedia.org/ .



XML dumps generation

The backup should be run on the dump holder (benet as of August 26th, 2005). As root, simply start the backup script:

benet# /home/wikipedia/bin/backup-all

Dumps are currently placed in /var/backup with public and private directories below that.
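The public/private split matters because the public directory is served on download.wikimedia.org while the private one holds user data. A minimal sketch of the layout, assuming conventional permissions (the /var/backup path and the public/private names are from the text; the demo path and mode bits are assumptions):

```shell
# Sketch of the dump output layout. BACKUP_ROOT stands in for /var/backup;
# the permission choices are assumptions, not taken from the real script.
BACKUP_ROOT="${BACKUP_ROOT:-/tmp/backup-demo}"

mkdir -p "$BACKUP_ROOT/public" "$BACKUP_ROOT/private"

# Public dumps are world-readable (they end up on download.wikimedia.org);
# private dumps contain user data and must not be.
chmod 755 "$BACKUP_ROOT/public"
chmod 700 "$BACKUP_ROOT/private"

ls -l "$BACKUP_ROOT"
```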


How it works

backup-all calls backup-site with the correct parameters for each Wikimedia project family (quote, tionary, books, pedia, news).

The real work is handled by backup-wiki, which dumps several tables (see the 'included tables' section below). The XML dump itself is generated with MediaWiki's dumpBackup.php maintenance script.
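The two XML dump files correspond to dumpBackup.php's --current (latest revision of each page) and --full (all revisions) modes. A dry-run sketch of what the per-wiki calls might look like; the "run" wrapper only logs the command, and the output directory and file names are assumptions:

```shell
# Dry-run sketch of the per-wiki dumpBackup.php invocations. --current,
# --full and --output=gzip: are dumpBackup.php options; the paths here
# are stand-ins, and "run" records commands instead of executing them.
OUT=/tmp/dumpbackup-demo
mkdir -p "$OUT"
: > "$OUT/commands.log"

run() { echo "$@" >> "$OUT/commands.log"; }   # swap the echo out to run for real

run php maintenance/dumpBackup.php --current \
    --output="gzip:$OUT/pages_current.xml.gz"
run php maintenance/dumpBackup.php --full \
    --output="gzip:$OUT/pages_full.xml.gz"

cat "$OUT/commands.log"
```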

A list of titles in the main namespace is generated for each wiki (all_titles_in_ns0.gz).
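On the real host that title list presumably comes out of MySQL (something like SELECT page_title FROM page WHERE page_namespace = 0); a minimal sketch with a stand-in list, so only the sort-and-gzip step is shown:

```shell
# Sketch of producing all_titles_in_ns0.gz. The three titles are stand-ins
# for a real SELECT against the page table; only the packaging step is real.
OUT=/tmp/titles-demo
mkdir -p "$OUT"

printf '%s\n' 'Main_Page' 'MediaWiki' 'XML' |
    sort | gzip > "$OUT/all_titles_in_ns0.gz"

zcat "$OUT/all_titles_in_ns0.gz"
```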

Once done, symbolic links are set as:

pages_current.xml.gz
pages_full.xml.gz
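The symlink step above keeps stable download names pointing at the newest dump. A sketch, assuming the real files carry dated names (the dated naming scheme is an assumption; the two link names are from the text):

```shell
# Sketch of the symlink step: stable names point at the latest dated dump.
# The 20051128_ prefix is a hypothetical naming scheme.
DIR=/tmp/symlink-demo
mkdir -p "$DIR"

: > "$DIR/20051128_pages_current.xml.gz"   # stand-ins for real dump files
: > "$DIR/20051128_pages_full.xml.gz"

ln -sf 20051128_pages_current.xml.gz "$DIR/pages_current.xml.gz"
ln -sf 20051128_pages_full.xml.gz    "$DIR/pages_full.xml.gz"

ls -l "$DIR"
```

Using ln -sf means a rerun after the next dump simply repoints the links, so download URLs never change.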


Included tables

public
    site_stats, image, oldimage, pagelinks, categorylinks, imagelinks
private
    user, watchlist, ipblocks, logging, user_rights, archive
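The split of SQL table dumps between the two directories can be sketched as a dry run (the table lists come from the text; the database name, the use of mysqldump and the output file layout are assumptions):

```shell
# Dry-run sketch of per-table SQL dumps into public/ and private/.
# "run" records each command instead of executing it; enwiki is a stand-in.
DB=enwiki
OUT=/tmp/tables-demo
mkdir -p "$OUT/public" "$OUT/private"
: > "$OUT/commands.log"

run() { echo "$1" >> "$OUT/commands.log"; }   # swap the echo out to run for real

for t in site_stats image oldimage pagelinks categorylinks imagelinks; do
    run "mysqldump $DB $t | gzip > $OUT/public/$t.sql.gz"
done
for t in user watchlist ipblocks logging user_rights archive; do
    run "mysqldump $DB $t | gzip > $OUT/private/$t.sql.gz"
done

wc -l < "$OUT/commands.log"   # 12 table dumps in total
```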

Not included

  • The database configuration tables in the mysql database
  • Any non-wiki tables (OTRS, bugzilla and such)
  • Some useful wiki tables:
    • recentchanges
    • searchindex
    • imagelinks (so all image-use information will be lost from image description pages)
    • interwiki
    • linkscc (so DB load will be high while it's rebuilt)
    • math
    • objectcache
    • querycache
    • hitcounter
    • user_newtalk

