Backups done by the system
There are no real backups of Ourshack systems.
We are not completely unprotected, though: this page describes the processes that make copies of certain critical data on Cat. Other machines do not currently have this sort of protection.
Configuration data
Certain critical data such as the contents of /etc is collected into a tar file each night and stored in /fs1/cat-backup; several generations are kept. This is done by the script /usr/local/libexec/do-cat-config-backup, which reads a list of files and directories from /usr/local/etc/config-backup-files. The script is run from /etc/periodic/daily/601.config-backup.
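The script itself is not reproduced here, but the pattern is simple: shift the existing generations up by one, then write a fresh tarball from the file list. A minimal sketch of that pattern follows; the function name and the rotation depth are assumptions, and the real logic lives in /usr/local/libexec/do-cat-config-backup.

```shell
# Sketch only: rotate_and_archive DIR PREFIX LISTFILE keeps rotated
# generations DIR/PREFIX.tar.NN.gz of the files named (one absolute
# path per line) in LISTFILE. Keeping three generations is an assumption.
rotate_and_archive() {
    dir=$1 prefix=$2 list=$3
    # Shift older generations up: 01 -> 02, then 00 -> 01
    for gen in 2 1; do
        prev=$(printf '%02d' $((gen - 1)))
        mv "$dir/$prefix.tar.$prev.gz" "$dir/$prefix.tar.0$gen.gz" 2>/dev/null
    done
    # Write generation 00; paths are stored relative to /, as in the
    # make-backup-stream script later on this page
    ( cd / && tar -c -z -f "$dir/$prefix.tar.00.gz" \
        $(sed 's|^/||' "$list") )
}
# On Cat this would be invoked roughly as:
#   rotate_and_archive /fs1/cat-backup cat-config /usr/local/etc/config-backup-files
```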
LDAP data
The data that drives the mail system is held in an OpenLDAP system.
The raw database files are copied as part of the configuration data mentioned above, but as an added protection the data is exported to a text file each night. This is done by the script /usr/local/libexec/do-ldap-backup, and the results are kept in /fs1/ldap-backup; again, several generations are kept. This process is run from /etc/periodic/daily/600.ldap-backup.
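OpenLDAP's standard export tool is slapcat, which dumps the whole directory as LDIF text, so the nightly job presumably amounts to a rotation plus a slapcat pipe. A hedged sketch; the helper name, file names, and rotation depth are all assumptions:

```shell
# Sketch only: dump_ldap DIR CMD... rotates DIR/ldap-data.ldif.NN.gz and
# writes a fresh gzipped dump produced by CMD (on Cat, CMD would be slapcat).
dump_ldap() {
    dir=$1; shift
    mv "$dir/ldap-data.ldif.01.gz" "$dir/ldap-data.ldif.02.gz" 2>/dev/null
    mv "$dir/ldap-data.ldif.00.gz" "$dir/ldap-data.ldif.01.gz" 2>/dev/null
    # slapcat writes the entire directory tree as LDIF on stdout
    "$@" | gzip > "$dir/ldap-data.ldif.00.gz"
}
# On Cat this would be roughly:  dump_ldap /fs1/ldap-backup slapcat
```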
Zope data
The Zope webserver/content management system holds all its data in a database rather than in flat files. To guard against corruption, this is backed up nightly to /fs1/zope-backup by the script /home/zope/bin/do-zope-backup, which is run from the zope user's crontab.
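Zope's database is normally a single file (conventionally var/Data.fs under the instance home), so the backup amounts to rotating old tarballs and tarring that file. Note that copying Data.fs while Zope is writing can yield an inconsistent snapshot, which is why ZODB ships a dedicated backup tool (repozo) for live instances. A sketch of the simple tar approach; the helper name and file locations are assumptions, and the real script is /home/zope/bin/do-zope-backup:

```shell
# Sketch only: zope_backup DIR DATAFILE rotates DIR/zope-data.tar.NN.gz
# and tars a fresh copy of DATAFILE (e.g. the instance's var/Data.fs).
zope_backup() {
    dir=$1 datafile=$2
    mv "$dir/zope-data.tar.01.gz" "$dir/zope-data.tar.02.gz" 2>/dev/null
    mv "$dir/zope-data.tar.00.gz" "$dir/zope-data.tar.01.gz" 2>/dev/null
    # -C keeps only the file name, not its full path, in the archive
    tar -c -z -f "$dir/zope-data.tar.00.gz" \
        -C "$(dirname "$datafile")" "$(basename "$datafile")"
}
# On Cat this might be:  zope_backup /fs1/zope-backup /home/zope/var/Data.fs
```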
How to make offsite backups
Doing network backups of the complete machines is likely to cause unreasonable amounts of traffic, but copying a few critical areas is certainly possible. The configuration-data tar-file mentioned above makes a good start, along with any critical areas that you are responsible for.
Please do not take copies of the config and LDAP data unless you can protect them from being seen by anyone other than Ourshack admins. Also, there is not much point in taking copies unless you know how to use the data to rebuild the box after a disaster...
The simplest way to copy a load of files is to use tar to make a datastream and ssh to move the data. Here is an example script to generate the backup stream; put it in your own bin directory on Cat:
#!/bin/sh
#
# make-backup-stream
#
# Create a tar-gzip stream on stdout with backup data
# Intended to be used as a fixed command in a .ssh authorized_keys2 file
#
# Note that the script does 'cd /' and omits the leading '/' from filenames
#
# andrew.findlay@skills-1st.co.uk
# 3 Feb 2003

PATH=/usr/ucb:/usr/bin:/bin
export PATH

cd /
tar -c -z -f - \
    --exclude usr/local/mailman/archives/private/contra-corner/database \
    --exclude usr/local/mailman/archives/private/iee-ec3/database \
    fs1/cat-backup/cat-config.tar.00.gz \
    fs1/zope-backup/zope-data.tar.00.gz \
    usr/local/www/lists.skills-1st.co.uk \
    usr/local/www/sites/www.nv-l.org \
    usr/local/mailman/archives/private/contra-corner \
    usr/local/mailman/archives/private/iee-ec3
To automate the process, you need to get your home machine to make an ssh connection periodically and run the script to generate the datastream. It is a good idea to use a new SSH key just for this job, which can be placed in the ~/.ssh/authorized_keys2 file with limits on what it can be used for, e.g. (all on one line - split here for easier reading):
command="/home/ajf/bin/make-backup-stream",no-pty,no-port-forwarding,
no-agent-forwarding,no-X11-forwarding ssh-dss
AAAAB3NzaC1kc3MAAACBAMh4kPixjMLktMwi/rvyti36D5yj ...
6Unn6dEzF5lxFuE/rfg+HMTx3MpqP5NQx/YH3d9uc9SsLpIcRw== ajf backup-only key
Finally, you need a script on your home machine to fetch the backup and store it locally. Something like this perhaps:
#!/bin/sh
#
# get-cat-backup
#
# Gets a backup from cat.ourshack.com

PATH=/usr/ucb:/usr/bin:/bin
export PATH

BACKUPDIR=/e2/backup/ourshack
umask 007

mv ${BACKUPDIR}/cat-backup.tar.02.gz ${BACKUPDIR}/cat-backup.tar.03.gz > /dev/null 2>&1
mv ${BACKUPDIR}/cat-backup.tar.01.gz ${BACKUPDIR}/cat-backup.tar.02.gz > /dev/null 2>&1
mv ${BACKUPDIR}/cat-backup.tar.00.gz ${BACKUPDIR}/cat-backup.tar.01.gz > /dev/null 2>&1

ssh -2 -n -T -x -i ~/.ssh/os-backup-key -l ajf cat.ourshack.com \
    > ${BACKUPDIR}/cat-backup.tar.00.gz
I run a backup of this sort from cron, once per week.
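For example, a crontab entry on the home machine along these lines would fetch the backup early every Sunday morning (the hour and the script location are just illustrations):

```
# min hour day month weekday  command
0 4 * * 0    $HOME/bin/get-cat-backup
```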
Andrew Findlay
28 Jan 2004