Quick question about the command find.

Patton Echols p.echols at comcast.net
Fri Jun 11 08:30:53 UTC 2010


Preston Hagar wrote:
> On Fri, Jun 4, 2010 at 9:18 AM, Maxime Alarie <malarie at processia.com> wrote:
>   
>> I know it's dangerous, but I'm okay with it... here is why:
>>
>>
>>
>> /dev/sda1:/var/backups syncs to /dev/sdb1:/backup every night, AND
>> once a week /dev/sdb1:/backup syncs to this external USB disk,
>> /dev/sdc:/backup (offsite).
>>
>>
>>
>> sda1 and sdb1 are 250 GB disks… sdc is 1 TB.  I want to erase backups
>> older than 15 days from sda1 and sdb1 :)
>>
>>
>>
>> I will try your commands. Thanks much for the help guys.
>>
>>     
>
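> For reference, a find-based cleanup of the kind being discussed would
> presumably look something like this (a sketch, not the exact command
> from the earlier messages; it assumes the backups sit in top-level
> directories under /var/backups, and the -print run is there so you can
> check what would be removed before switching to the rm):
>
> find /var/backups -mindepth 1 -maxdepth 1 -mtime +15 -print
> find /var/backups -mindepth 1 -maxdepth 1 -mtime +15 -exec rm -rf {} +
>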
> I know this doesn't directly answer your question, but there might be
> an easier/safer way to rotate your backups.  It really depends on how
> you do your backups, but from the sound of things, you might be using
> rsync and a cron job.  Anyway, here is a nice little backup script I
> run from cron that uses rsync.  I have tried to comment it throughout
> so it is clear what each part does:
>
>
> #!/bin/bash
>
> # log output to file
> exec > /var/log/backup.log
>
> echo "*****Backup Started: " `date`
>
> # set this variable to the directory you want to place your backups in
> backup_dir=/var/backups
>
> echo "Rotating Snapshots"
> # bail out if the backup directory is missing, so the rm/mv below
> # never run somewhere unexpected
> cd "$backup_dir" || exit 1
>
> # remove the oldest backup and advance all the other days' backups by one
> rm -rf 15
> mv 14 15
> mv 13 14
> mv 12 13
> mv 11 12
> mv 10 11
> mv 9 10
> mv 8 9
> mv 7 8
> mv 6 7
> mv 5 6
> mv 4 5
> mv 3 4
> mv 2 3
> mv 1 2
>
> # Copy the current to the snapshot head
> # the cp -al creates hard links during the copy, so if a file remains
> # unchanged, it is only stored on disk once.  This allows for many more
> # days of backups, since you are only storing each changed copy of a
> # file and not multiple copies of identical files
> cp -al "$backup_dir/current" "$backup_dir/1"
>
> # the touch commands update the timestamp on the backup directories
> # so you can easily find out what date/time a given backup was made
> touch "$backup_dir/current"
> touch -r "$backup_dir/current" "$backup_dir/1"
>
> echo "Archiving to backup folder"
> # back up the /home directory.  You can change that to whatever
> # directory you need to back up, or you can have multiple rsync lines,
> # one for each directory you need to back up.
> # On some machines I have done the opposite: I back up / and use an
> # --exclude-from file to exclude the directories I do _not_ want
> rsync -av /home/ "$backup_dir/current/"
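> # (Sketch, not part of the original script: an --exclude-from variant
> # would look something like the line below, where /etc/backup.excludes
> # is a hypothetical file listing one pattern per line, e.g. /proc,
> # /sys, /tmp, and /var/backups itself.)
> # rsync -av --exclude-from=/etc/backup.excludes / "$backup_dir/current/"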
>
> echo "********Backup Completed: " `date`
>
>
> # END SCRIPT
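>
> To schedule it, a crontab entry along these lines would do (a sketch:
> the path and time are placeholders, assuming you saved the script as
> /usr/local/bin/rotate-backup.sh and made it executable):
>
> # run the backup rotation every night at 02:30
> 30 2 * * * /usr/local/bin/rotate-backup.sh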
>
> As an example, on one server where I use this script, I can store 40
> days of backups of 230 GB of data using only 372 GB of space for the
> backups.  This is because of the cp line that uses hard links.  How
> many days you can store really depends on how much the data you are
> backing up changes each day.
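>
> You can see the hard linking at work by comparing sizes and inodes
> (a quick check; "somefile" here is just a stand-in for any file that
> did not change between snapshots):
>
> du -sh /var/backups/1                # size of one snapshot on its own
> du -sh /var/backups/                 # total: far less than 15 x snapshot
> ls -li /var/backups/{1,2}/somefile   # identical inode number if unchanged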
>
> Hopefully this helps.  With this script, you can easily
> increase/decrease the number of days by modifying the mv and rm lines.
> You could also put the mv lines in a for loop to clean it up a bit; I
> just have never gotten around to it.
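>
> For what it's worth, that for loop might look something like this
> (untested sketch, same 15-day window as above):
>
> rm -rf "$backup_dir/15"
> for i in $(seq 14 -1 1); do
>     [ -d "$backup_dir/$i" ] && mv "$backup_dir/$i" "$backup_dir/$((i+1))"
> done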
>
> Preston
>
>   
I know I am a little late to this party, but I wanted to say thanks,
Preston, for the script.  I had read somewhere about this basic
technique, lost the link before implementing it, and was planning to
start searching again, and there was your post, with the script!  I
appreciate it.

-- PE




