Thank God for backups!

Daniel Carrera daniel.carrera at zmsl.com
Sat Apr 22 12:20:40 UTC 2006


Vincent Trouilliez wrote:
>>If, instead, you make full backups every 60 days, then 100GB will give 
>>you about 11 months of backups. On the other hand, if you did full 
>>backups every 14 days, then 100GB would last you only 3 months.
> 
> Ah, so the backup will just grow in size forever then ! :-O
> 
> I don't want a backup to go back in time 3 years ago and restore
> particular files.

Yes. You could, of course, delete them regularly. You could set up a 
cron job to delete (say) anything older than 2 years. I can help you 
with that if that sounds interesting.
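For instance, a single crontab line can do it. This is only a sketch: 
/var/backup is an assumed path, and it presumes one backup file or 
directory per run:

   # Every night at 3 AM, delete anything directly under /var/backup
   # that is older than ~2 years (730 days). -mindepth 1 stops find
   # from matching /var/backup itself.
   0 3 * * * find /var/backup -mindepth 1 -maxdepth 1 -mtime +730 -exec rm -rf {} +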

> All I want is a snapshot, as current as possible,
> of /home, so that if the system crashes, I can restore my data as it was
> just before the crash, not as it was last week or last year, so as to
> lose as little data as possible.

You may feel differently when you realize that you deleted the file 
last Tuesday and now you want it. A perfect example is the password 
file that started this thread. I could have deleted ~/.kde last week 
and not realized it was important until today.

I *strongly* suggest that you keep more than one day's snapshot. A 
week should be the bare minimum. I suggest you go for 1 month. We can 
make it work with 1 month, and you'd only need ~40GB of storage.


> Do an initial full back-up, then:
> 
> 1) Monday to Saturday : automatic incremental backups at night, send it
> to a dedicated headless machine via a high-speed/Gigabit Ethernet
> network.
> 2) Sunday: delete all previous backups, and replace with fresh full
> backup.
> 3) back to 2)

Better yet:
* Sunday - full backup
* Monday - delete all backups older than last Saturday.
* Daily incremental backups.

The problem with deleting the backups on Sunday is that on Sunday you 
effectively have *no* backups at all. With my scheme, the most 
sensitive day is Monday, and on Monday your backups are only one day old.
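Sketched as a crontab, that scheme could look something like this. All 
the paths are examples, and I'm using GNU tar's listed-incremental 
mode rather than whatever sbackup does internally. (Note that % must 
be escaped as \% inside a crontab.)

   # Sunday 2 AM: reset the snapshot file and take a full backup.
   0 2 * * 0 rm -f /var/backup/home.snar && tar --listed-incremental=/var/backup/home.snar -czf /var/backup/full-$(date +\%Y\%m\%d).tar.gz /home
   # Monday-Saturday 2 AM: incremental backup of what changed.
   0 2 * * 1-6 tar --listed-incremental=/var/backup/home.snar -czf /var/backup/incr-$(date +\%Y\%m\%d).tar.gz /home
   # Monday 6 AM: delete everything older than last Saturday.
   0 6 * * 1 find /var/backup -mindepth 1 -mtime +2 -exec rm -f {} +

One caveat: an incremental can only be restored on top of the full 
backup it was made against, so in practice you'd want the Monday 
cleanup to spare last week's full as well (e.g. -mtime +8).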

Maybe you'll consider going for a 2-week scheme? That way, on any 
given day you'd have backups going back at least a week.

> because the most recent backup is the only one I am going to want
> when restoring the system after a crash anyway.

In real life:
* You may want a file that you deleted 2 days ago (2 days is not a long 
time).
* A crash is not the only reason you might want a backup. Accidental 
deletion is a more common cause of data loss than crashes (when was 
the last time your disk crashed?). Accidental deletion is also 
potentially worse, because you might not find out about it for 
several days. A crash is "easy" in that at least you find out right away.

> That also means I don't need/want to compress the data.

Yes, you do want to compress data. Trust me on that. You lose 
*nothing* by compressing data, and you save a lot of disk space and 
aggravation when you realize that you deleted your critical passwords 
file last Tuesday.
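With the standard tools, compression is a single flag. A sketch, with 
example paths:

   # Adding 'z' makes tar gzip the archive as it is written:
   tar czf /var/backup/home.tar.gz /home
   # gzip -l reports compressed vs. uncompressed size and the ratio:
   gzip -l /var/backup/home.tar.gz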

> It would take
> hours on my machine (I don't have a "Niagara" processor unfortunately...
> just an old 1.5GHz CPU). 

It might be worth running a benchmark. Disk writes, not the CPU, are 
often the bottleneck: compressing the data might actually be faster 
on an old computer because the disk doesn't have to write as much. 
It's hard to know without testing it.
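It's easy to measure on your own data. For example (pick any 
reasonably large directory; the numbers will vary with hardware and 
content):

   # Compare wall-clock time with and without compression:
   time tar cf  /tmp/nocompress.tar  ~/Documents
   time tar czf /tmp/compress.tar.gz ~/Documents

If the disk really is the bottleneck, the compressed run can come out 
ahead despite the extra CPU work.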

But why would this matter? Set the backups to run at 2AM every morning. 
Who cares if it takes from 2AM to 7AM if you only need the computer at 8AM?

> Maybe these features could be added to sbackup, but I guess in the
> meantime I have no choice but to get my hands dirty and take the time to
> learn how to use rsync and bash scripting to do what I want to do
> then :-(

I can help you set up a cron job to delete backups older than x days. 
But first let's figure out what backup strategy makes sense for you.

Cheers,
Daniel.
-- 
      /\/`) http://opendocumentfellowship.org
     /\/_/
    /\/_/   A life? Sounds great!
    \/_/    Do you know where I could download one?
    /
