Dumps/Snapshot hosts

From Wikitech
Revision as of 08:58, 20 January 2012 by ArielGlenn


Snapshot (XML dumps generation) cluster information

Hardware

These hosts generate the XML dumps. For information about the hosts that serve them, see Dumps/Dump servers.

We have two mini snapshot clusters.

In Tampa:

  • snapshot1: operational, PowerEdge 1950, Ubuntu 10.04, 8GB RAM, 2 quad-core Xeons, 80GB HD
  • snapshot2: operational, PowerEdge 1950, Ubuntu 10.04, 8GB RAM, 2 quad-core Xeons, 80GB HD
  • snapshot3: operational, PowerEdge 1950, Ubuntu 10.04, 8GB RAM, 2 quad-core Xeons, 80GB HD
  • snapshot4: operational, PowerEdge R815, Ubuntu 10.04, 8GB RAM, 4 8-core Opterons, 2 80GB HDs

In D.C.:

  • snapshot1001: base install done, PowerEdge R815, Ubuntu 10.04, 64GB RAM, 4 8-core Opterons, 2 80GB HDs
  • snapshot1002: base install done, PowerEdge R410, Ubuntu 10.04, 16GB RAM, 2 6-core Xeons, 500GB HD
  • snapshot1003: base install done, PowerEdge R410, Ubuntu 10.04, 16GB RAM, 2 6-core Xeons, 500GB HD
  • snapshot1004: base install done, PowerEdge R410, Ubuntu 10.04, 16GB RAM, 2 6-core Xeons, 500GB HD

Ordinarily only one cluster runs dump jobs at a time; the other is on standby in case of failures. The two beefier servers (with four 8-core CPUs each) are dedicated to the English Wikipedia dumps; as with the other hosts, one of them is in operation and the other is on standby.

Currently running

Monitors:

  • snapshot1 -- current monitor node:
    /bin/bash ./monitor wikidump.conf.monitor

Worker nodes:

  • snapshot1 -- currently running 3 worker processes for bigger wikis out of /backups-atg, via
    python ./worker wikidump.conf.bigwikis
  • snapshot2 -- running 4 processes for small wikis out of /backups-atg, via
    ./worker
  • snapshot3 -- runs adds/changes dumps from cron as user backup
  • snapshot4 -- running en wiki dumps via
    python ./worker.py --configfile wikidump.conf.enwiki enwiki
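As a sketch of how several worker processes might be started on one host (the script and config names come from the commands above; the launcher loop, worker count, and log paths are illustrative assumptions, not the production setup):

```shell
#!/bin/bash
# Hypothetical launcher: start N dump worker processes with a given config.
# Script/config names follow the commands listed above; everything else
# (argument handling, log files, worker count) is an assumption.
CONFIG=${1:-wikidump.conf.bigwikis}
NWORKERS=${2:-3}

for i in $(seq 1 "$NWORKERS"); do
    # Each worker runs independently and picks up the next wiki to dump.
    nohup python ./worker.py --configfile "$CONFIG" \
        > "worker-$i.log" 2>&1 &
done
wait   # block until all workers exit
```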

Other tasks

  • snapshot1 -- as user backup from cron, /backups-atg/dumpcentralauth.sh every two weeks to dump the central auth tables
  • snapshot1 -- as user backup from cron, /backups-atg/create-rsync-list.sh to generate list of XML dump files once a day to be mirrored by other organizations
  • snapshot1 -- as user datasets from cron, /usr/local/bin/daily-pagestats-copy.sh to copy over pagecount data from locke to a publicly accessible web dir once an hour
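The cron entries above might look something like the following (only the script paths and frequencies come from the list; the exact times are assumptions, and "every two weeks" is approximated as the 1st and 15th of the month since crontab has no biweekly field):

```shell
# Illustrative crontab sketch for snapshot1 (times are assumptions).

# user backup: dump the CentralAuth tables roughly every two weeks
30 3 1,15 * *  /backups-atg/dumpcentralauth.sh

# user backup: regenerate the rsync file list for mirrors once a day
0 4 * * *      /backups-atg/create-rsync-list.sh

# user datasets: copy pagecount data to the public web dir once an hour
0 * * * *      /usr/local/bin/daily-pagestats-copy.sh
```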