"Error splicing file: File too large"

Joel Rees joel.rees at gmail.com
Sun Sep 18 22:37:52 UTC 2016


On Mon, Sep 19, 2016 at 6:53 AM, Dave Stevens <geek at uniserve.com> wrote:
> On Mon, 19 Sep 2016 06:44:19 +0900
> Joel Rees <joel.rees at gmail.com> wrote:
>
>> [...]
>> (And, apparently, the OP's device was formatted in a variant of FAT32
>> that does handle greater than 4GB files, since it errored out at the
>> 99% mark. Or something.)
>
> perhaps I can clarify. The display didn't say 99%. Instead it was a
> file copy operation counting down from 67K files. It had got to 500-odd
> to go before crapping out. I don't necessarily think this meant it had
> actually finished copying that many files and in fact when I looked at
> the transferred file size after the crash it was about 4GB.

One of those cases where the small files get copied first, maybe.
(Copy strategies used by system utilities are another point subject to
disagreement.)

But it sounds like you were zipping directly to the USB device?

In your original post it sounded like you were zipping to somewhere
more stable and then copying the result to the USB device, which is
the strategy I'd recommend, if it's possible.
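
Something along these lines, in Python terms, just to illustrate the
two-step idea (the paths here are made up, not what you actually used):

    import shutil

    # 1) Build the archive on stable local storage, not on the stick.
    archive = shutil.make_archive("/tmp/backup", "zip",
                                  root_dir="/home/dave/data")

    # 2) Copy the finished archive to the mounted USB device.
    shutil.copy(archive, "/media/usb/backup.zip")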

You really can't depend on the file system of USB devices as they are
sold, if you want to store single files larger than 2G (whether JEDEC
or IEC Gig).
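
If you want to see what the stick is actually formatted as before
trusting it, something like this works on a Linux box (the mount point
is hypothetical; substitute yours):

    mount_point = "/media/usb"

    with open("/proc/mounts") as mounts:
        for line in mounts:
            device, mpoint, fstype = line.split()[:3]
            if mpoint == mount_point:
                # "vfat" here means the usual FAT32 per-file limits apply
                print(device, "is mounted as", fstype)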

Some USB devices can be reformatted to a known file system, and you
can choose one that supports larger than twice your target max file
size.
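
Before going to the trouble of reformatting, a quick check of the
archive against FAT32's per-file ceiling (4 GiB minus one byte) tells
you whether you even need to bother. Rough sketch, file name made up:

    import os

    FAT32_LIMIT = 4 * 1024**3 - 1  # FAT32's per-file ceiling in bytes

    if os.path.getsize("/tmp/backup.zip") > FAT32_LIMIT:
        print("won't fit as a single file on a FAT32-formatted stick")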

But you also want to consider error rates in transmission and media.
I'm not sure I'd want to keep a zip archive in the 16G range,
depending on what is in the archive.
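
If you do keep archives that size on removable media, it's worth at
least hashing the file before and after the copy so silent corruption
shows up right away. A minimal sketch (file names are only examples):

    import hashlib

    def sha256_of(path, chunk=1024 * 1024):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while True:
                block = f.read(chunk)
                if not block:
                    break
                h.update(block)
        return h.hexdigest()

    if sha256_of("/tmp/backup.zip") == sha256_of("/media/usb/backup.zip"):
        print("copy verified")
    else:
        print("mismatch -- recopy the archive")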

-- 
Joel Rees

I'm imagining I'm a novelist:
http://joel-rees-economics.blogspot.com/2016/04/economics-101-novel-rough-draft-index.html
