The golden rules of backup:

  1. All data that is not backed up must be considered deleted
  2. Only use automatic backups
  3. Test whether your backup is complete and recoverable
  4. If you encrypt your backups, make sure you don't ever lose the key
  5. Simple is beautiful
  6. Store backups on different media in different locations
  7. RAID ain't no backup

I had to learn the part about not losing the encryption key the hard way. Trust me, nothing is more frustrating than a perfectly good backup made the day before the crash that you cannot decrypt. Make copies of the encryption key! Store the copies on different media in different places!

Simplistic server backup under Linux

Sure, there are all kinds of sophisticated backup applications. But in the end you will be glad if you used the simplest, most reliable method as your first line of defence against data loss.

I prefer good old tar and full backups instead of incremental backups. If you don't have terabytes of data to protect, this method is sufficient.
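Such a full backup boils down to a single tar call. Here is a minimal round-trip sketch, using scratch directories in place of your real data and backup paths (substitute e.g. /data /etc /home):

```shell
# Demo of a timestamped full backup; the scratch directories stand in
# for your real data and backup locations.
SRC=$(mktemp -d)                 # stands in for the data to protect
DEST=$(mktemp -d)                # stands in for the backup target
echo "hello" > "$SRC/file.txt"

TS=$(date "+%Y-%m-%d_%H-%M")
BACKUPFILE="$DEST/backup-data-$TS.tar.bz2"
tar cPjf "$BACKUPFILE" "$SRC"

# Golden rule 3: verify the archive is readable and lists its contents
tar tjf "$BACKUPFILE"
```

Listing the archive with tar tjf after every run is the cheapest form of the "test your backup" rule; a real restore test to a spare directory is even better.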


ccrypt for encryption

Install ccrypt if you have to transfer your backup via an unprotected channel:

aptitude install ccrypt

Create a key file:

vi /root/.ccrypt.key
chmod 600 /root/.ccrypt.key

<note important>Store the key in multiple safe places! In a key ring. On paper in a safe. Don't lose it, otherwise your precious backup will be a pile of useless random bytes.</note>
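To see the encryption in action, here is a round trip through ccrypt's filter mode (no file arguments, stdin to stdout), with a throwaway key file standing in for /root/.ccrypt.key:

```shell
# Encrypt a stream with ccrypt and decrypt it again (filter mode).
# The scratch key file stands in for /root/.ccrypt.key.
KEYFILE=$(mktemp)
echo "my secret passphrase" > "$KEYFILE"    # example passphrase only
chmod 600 "$KEYFILE"

echo "payload" | ccrypt -e -k "$KEYFILE" > /tmp/demo.cpt
# Decrypts back to the original stream
ccrypt -d -k "$KEYFILE" < /tmp/demo.cpt
```

The -k option reads the passphrase from the first line of the key file, which is what lets the backup script below run unattended.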

ncftp for automated uploads to your FTP space

Most providers of root servers or v-servers offer you a quota on their FTP server, where you can put your backups.

Install ncftp:

aptitude install ncftp

Create a file /root/ftp-access.cfg with your FTP host and credentials (this is the login.cfg format the ncftp tools read via -f):

host yourftphost
user youruser
pass yourpassword

Make sure the file is only root-readable:

chmod 600 /root/ftp-access.cfg

KISS - Keep it simple, stupid

I use the following little script for automated backups to my FTP storage:

#!/bin/bash
# Adjust these paths and values to your setup
FTPCFG="/root/ftp-access.cfg"      # FTP credentials (see above)
KEYFILE="/root/.ccrypt.key"        # ccrypt key file (see above)
LOG_FILE="/var/log/backup.log"
FTPDIR="/backup"                   # directory on the FTP server
KEEP=4                             # number of backups to keep remotely
TS=$(date "+%Y-%m-%d_%H-%M")
DBFILE="/root/mysqldump-$TS.sql"
BACKUPFILE="/root/backup-data-$TS.tar.bz2.cpt"
INCLUDE="/data /etc /var/log /home $DBFILE"
FTPHOST=$(grep '^host' $FTPCFG | sed -e 's/^host//' -e 's/ //g')
# Backup will be accessible for root only
umask 0077
# Write to a log file
function write_log {
  echo $(date '+[%Y-%m-%d %H:%M.%S]: ') "$*" | tee -a $LOG_FILE
}
write_log "Starting backup of $INCLUDE"
write_log "Dumping database to $DBFILE"
nice mysqldump -A --add-drop-database -u backup > $DBFILE
write_log "Creating and encrypting backup archive $BACKUPFILE"
nice tar cPjf - $INCLUDE | ccrypt -e -k $KEYFILE > $BACKUPFILE
write_log "Removing database dump"
rm -f $DBFILE
# Remove old files on the FTP server, keeping the newest $KEEP
write_log "Checking $FTPHOST for files that need to be removed"
REMFILES=$(ncftpls -f $FTPCFG -x "-1" ftp://$FTPHOST$FTPDIR/ | grep backup-data | sort | head -n -$KEEP)
for F in $REMFILES; do
  write_log "Removing remote file: $F"
  # DELE is the raw FTP delete command, sent via ncftpls -W
  ncftpls -f $FTPCFG -W "DELE $F" ftp://$FTPHOST$FTPDIR/ > /dev/null
done
# Transfer the backup file to the FTP server
write_log "Starting FTP transfer"
ncftpput -f $FTPCFG $FTPDIR $BACKUPFILE
write_log "Removing temporary backup archive $BACKUPFILE"
rm -f $BACKUPFILE
write_log "Done"

Could be improved:

  * The script does not recognize if the FTP space is too small for a backup.
  * No feedback (e.g. via mail) if anything goes wrong.
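The first point can be addressed with a few lines before the tar call. A sketch, with example threshold and target path (tune both to your setup):

```shell
# Abort early when the filesystem holding the archive is low on space.
# TARGET and NEEDED_KB are example values; adjust them to your setup.
TARGET=/tmp                  # directory the archive is written to
NEEDED_KB=1024               # minimum free space in KB
AVAIL_KB=$(df -Pk "$TARGET" | awk 'NR==2 {print $4}')
if [ "$AVAIL_KB" -ge "$NEEDED_KB" ]; then
    echo "enough space: $AVAIL_KB KB free on $TARGET"
else
    echo "not enough space on $TARGET ($AVAIL_KB KB free)" >&2
fi
```

In the real script you would replace the else branch with a write_log call and an exit, so a full disk aborts the run instead of producing a truncated archive.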

MySQL backup

The script above backs up MySQL databases. See the article on how to back up MySQL databases for details. For the script above to work, a MySQL user named backup must exist and its password needs to be stored in a .my.cnf file in the home directory of the system user you run the script under.
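The .my.cnf could look like this; the [client] section is standard MySQL option-file syntax, the user name matches the script above, and the password is of course your own:

```ini
[client]
user     = backup
password = yourpassword
```

Keep this file mode 600 as well, for the same reason as the key and FTP credential files.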

Doing regular backups

A backup script is worthless if it is not run regularly. So please make sure you run your backup script as a cron job. The easiest way is to put a copy into the /etc/cron.weekly or /etc/cron.daily directory.

<note important>Make sure the filename in the cron directory *does not contain any dots*. Otherwise it won't be executed by the run-parts command, which starts the scripts in the cron job directories.</note>
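You can check the dot rule with run-parts --test, which prints the scripts run-parts would actually execute. A quick demonstration with a scratch directory standing in for /etc/cron.weekly:

```shell
# run-parts ignores filenames containing dots; only "backup" qualifies.
DIR=$(mktemp -d)
printf '#!/bin/sh\necho ran\n' > "$DIR/backup.sh"   # has a dot: skipped
printf '#!/bin/sh\necho ran\n' > "$DIR/backup"      # no dot: executed
chmod 755 "$DIR/backup.sh" "$DIR/backup"
run-parts --test "$DIR"
```

Running the same command against the real /etc/cron.weekly directory after installing your script confirms it will actually be picked up.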

backup.txt · Last modified: 2015/05/17 17:44 (external edit)