Backing up to an external FAT32 disk
Roby
electricalsciences at adelphia.net
Fri Aug 24 12:58:47 UTC 2007
Nils Kassube wrote:
> OK, this thread is very old, but I just had the unfortunate opportunity to
> recover data from a broken backup file which was created according to my
> suggestion below.
>
> Rashkae wrote:
>> Nils Kassube wrote:
>> > Liam Proven wrote:
>> >> I am trying to back up a 98% full RAID array (some 100GB of stuff)
>> >> onto a 400GB FAT32 USB2 external hard drive. I don't have space on
>> >> the RAID itself to create the archive, nor on my 3G root FS.
>> >>
>> >> I tried (in my /media directory, which holds the mountpoints for
>> >> both the RAID and the USB drive):
>> >>
>> >> tar -cvf usbdrive/raid.tar raid/
>> >>
>> >> This worked, but I forgot one detail: FAT32 has a maximum file size
>> >> of just under 4GiB. So when the archive reached that limit, it barfed.
>> >
>> > Try this:
>> >
>> > tar cf - raid | split -d --suffix-length=3 --bytes=1000m - usbdrive/raid_
>> >
>> > That will give you files of 1000MB each, named usbdrive/raid_<nnn>,
>> > with <nnn> being numbers counting up from 000.
>> >
>> > You can test the archive with:
>> >
>> > cat usbdrive/raid_* | tar tvf -
>> >
>> > And you can restore a file with:
>> >
>> > cat usbdrive/raid_* | tar xf - <filename>
>> >
>> >
>> > Nils
>>
>> This is the right approach, but I would use a command more like:
>>
>> tar -czf - raid | split -b 2000m - /media/usbdrive/raid.tar.gz.
>>
>> And as already mentioned, to restore:
>>
>> cat /media/usbdrive/raid.tar.gz.* | tar -xzf -
>> (run this in the directory where you want the files restored)
>
> This extra compression is a bad idea, and I want to explain why. If you
> split compressed data, you can only recover data from the beginning of
> the stream up to the position where an error occurs (a missing part or
> an I/O error); gzip cannot resynchronise after a broken spot. If you
> have an uncompressed tar file split into several parts, you can recover
> at least some data from each individual part, even if another part is
> missing or corrupted. I didn't know this until I had to recover vital
> data from broken parts.
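>
> For example, each surviving part can be fed to tar on its own, with
> something like
>
> for part in usbdrive/raid_*; do tar xf "$part"; done
>
> and tar will extract whatever complete files it finds in each piece.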
>
> In my case I had an uncompressed tar file of about 8GB which had to be
> saved to DVDs. I used split to cut it into 3 parts. The parts were
> written to DVDs, but unfortunately I didn't verify the discs. When it
> was time to restore the files from the DVDs, every single file had read
> errors somewhere. Using the dd command I could read the data following
> the broken blocks, which left me with 6 non-consecutive pieces of the
> original tar file. With the commands
>
> tar xf part1
> tar xf part2
> ... etc.
>
> I could recover data from every single part, although tar complained
> about broken data at the beginning and end of most parts. In the end,
> only a very few files were really lost. If I had used compression, I
> would have lost most of the files.
>
>
> Nils
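
For anyone who has to do that kind of rescue: the usual dd incantation for
reading past bad spots is something like (the device name here is just an
example)

dd if=/dev/dvd of=part1 bs=2048 conv=noerror,sync

conv=noerror keeps dd going after a read error, and sync pads the
unreadable sectors with zeros, so the offsets in the rest of the image
stay aligned and tar can still find the later headers.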
Have a look at afio. It compresses each file individually AND THEN adds it
to the archive it creates, so a bad spot in the archive only costs you the
files it actually touches ... tar with -z does it the other way round: it
builds the giant archive first and then compresses the whole stream. I
lost a lot of data due to an error in a compressed tarball.
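
From memory (untested), the usage is along these lines:

find raid -print | afio -o -v -Z usbdrive/raid.afio

to create the archive (-Z gzips each file before it goes in), and

afio -i -v -Z usbdrive/raid.afio

to restore into the current directory. afio also has multi-volume support
(the -s option), which would take care of the FAT32 size limit as well.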