Setting up Ubuntu with regular backups
Stephen R Laniel
steve at laniels.org
Mon Jun 6 14:59:34 UTC 2005
On Mon, Jun 06, 2005 at 04:39:28PM +0200, Arjan Geven wrote:
> Now, the obvious software to do such thing would be rsync I guess
> (right?).
Rsync is a good start, but I use rdiff-backup instead --
which is built atop rsync. It can do local-to-local backups,
local-to-network, and network-to-local. It stores reverse
diffs (using the rdiff tool, or at least some library
equivalent to it), which means that it always stores the
most recent version of a file, plus all the changes needed
to recover older versions. That's nice, because presumably
the most recent version is the one you'll want most of the
time; and since it's stored as a raw file, you can retrieve
it with a straight copy ('cp').
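As an illustration of that (the host and paths here are made
up, not my real ones), recovering files is just another
rdiff-backup invocation:

rdiff-backup --list-increments remotehost::/backups/etc
rdiff-backup -r now remotehost::/backups/etc/fstab /tmp/fstab
rdiff-backup -r 10D remotehost::/backups/etc/fstab /tmp/fstab.old

--list-increments shows which older versions are available,
-r now pulls the current copy, and -r 10D restores the
version from ten days ago.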
So I use rdiff-backup, then stick it in a cron job (cron
being the tool that schedules commands to execute
periodically). I use the same setup with my clients, whose
machines I can then sign into remotely to check how the
backup went (or is going).
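If it helps, the crontab entry might look something like the
following (the script path and logfile are just examples, not
my actual setup):

0 3 * * * /root/bin/run-backup.sh >> /var/log/rdiff-backup.log 2>&1

That runs the backup script every night at 3:00 and appends
its output to a logfile you can look at later; edit the root
crontab with 'crontab -e' as root, since backing up / needs
root privileges anyway.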
There's no reason you couldn't do all of this graphically.
But one virtue of the command-line approach -- among many,
I'd say -- is that you can sign into your machine and check
what's going on from anywhere in the world with a minimum of
tools; you can just use PuTTY. With a graphical tool, it
seems to me that you'd either need a fairly high-bandwidth
connection to check on it, or the tool would have to write
out logs which you'd then check over ssh. But then again, if
remote checking isn't something you
need to do, this won't be very convincing.
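(To make the log-checking option concrete: assuming the cron
job appends to /var/log/rdiff-backup.log as in the sketch
above, checking from anywhere is just

ssh steve@mymachine.example.org 'tail -n 30 /var/log/rdiff-backup.log'

and PuTTY gets you the same thing from a Windows box.)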
Finally, I think the rdiff-backup command-line syntax is
fairly straightforward. The command I use to do the backup
is
rdiff-backup -v5 --include-globbing-filelist $includefile --exclude '**' / $remoteMachine::$remoteBackupDir
which says, "Take the file called $includefile, grab all the
files listed inside of it -- which in this case are
/boot
/etc/
/home/
/mnt/caviar/home
/usr/src
/var/mail/
/var/www/
-- and exclude everything else. Start the backup at the root
directory (/), and back up to $remoteBackupDir on
$remoteMachine." (The dollar-sign syntax is a bit more
arcane than it needs to be for the purpose of this example,
only because that command is embedded inside of a script
that does other things like write out some text to a
logfile.)
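For what it's worth, the surrounding script might look
roughly like this (the host name, paths, and logging are
illustrative, not my actual script):

#!/bin/sh
# Nightly backup wrapper; cron redirects this output to a logfile.
includefile=/etc/backup/include.txt   # the list of paths shown above
remoteMachine=backuphost.example.org
remoteBackupDir=/backups/mymachine

echo "Backup started: $(date)"
rdiff-backup -v5 --include-globbing-filelist $includefile \
    --exclude '**' / $remoteMachine::$remoteBackupDir
echo "Backup finished: $(date)"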
All that said, I've heard other recommendations too.
Someone on the linux-elitists mailing list recommends just
using tar -- the standard, old-school Linux archiving
program. One of his big reasons is that backup should be
done using a bare minimum of tools, so that if your machine
totally craps out or you need to recover it using (who knows
why) minimal hardware like a floppy disk, you'll have all
the tools you need. It's sort of like preferring a car
lacking fuel injection, because you want to be able to fix
it with a hammer and a screwdriver.
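For what that approach is worth, a bare-bones tar version of
the same idea could be as simple as (again, the destination
path is just an example):

tar czf /mnt/backup/backup-$(date +%Y%m%d).tar.gz /etc /home /var/mail /var/www

Tar is on every rescue disk, but you give up the space-saving
incremental behavior that rdiff-backup gets from its reverse
diffs.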
AMANDA is the more hardcore backup approach. I'm sure others
have their own suggestions.
--
Stephen R. Laniel
steve at laniels.org
+(617) 308-5571
http://laniels.org/
PGP key: http://laniels.org/slaniel.key