[Bulk] Re: Large Binary Files
Martin Pool
mbp at canonical.com
Fri Oct 15 09:31:40 BST 2010
On 15 October 2010 13:04, jbowtie at amathaine.com <jbowtie at amathaine.com> wrote:
> Crucially, this just reduces your memory usage to a couple of blocks
> at any given time, plus the overhead of the pseudo-file itself. So
> you should be able to handle binary files of arbitrary size on your
> 32-bit platform just fine. At the same time, this should require
> fairly minimal changes to the core on-disk format; it just needs to
> recognise the magic pseudo-file during certain operations.
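The block-at-a-time idea above can be sketched roughly like this; the names and the 4 MiB block size are my own illustrative choices, not anything from bzr's codebase:

```python
# Hypothetical sketch: read a large file in fixed-size blocks so that
# only one block is held in memory at a time, regardless of file size.

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB per block (illustrative choice)

def iter_blocks(path, block_size=BLOCK_SIZE):
    """Yield the file's contents one block at a time."""
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                return
            yield block
```

Anything consuming this generator sees the whole file, but peak memory stays bounded by the block size rather than the file size.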
Our current on-disk format ought to support handling large files with
constant memory usage, by streaming them through. Some of the
internal interfaces use chunk iterators; some don't; and some use
them but are probably accidentally concatenating the chunks into
single byte strings.
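To illustrate the difference (these are hypothetical functions of my own, not bzr internals): a consumer that processes each chunk as it arrives runs in constant memory, while one that joins the chunks first silently pulls the whole file back into memory, defeating the streaming interface:

```python
import hashlib

def checksum_streaming(chunks):
    """Consume a chunk iterator in constant memory: feed each chunk
    to the hash as it arrives, never holding the whole file."""
    h = hashlib.sha1()
    for chunk in chunks:
        h.update(chunk)
    return h.hexdigest()

def checksum_concatenated(chunks):
    """The accidental pattern: joining all chunks into one byte
    string first, which materialises the entire file in memory."""
    return hashlib.sha1(b"".join(chunks)).hexdigest()
```

Both return the same result, which is exactly why the concatenating version can slip in unnoticed until someone feeds it a file larger than RAM.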
A plugin that works around this by chopping up the file could be
pragmatic, but it does seem like a bit of a hack.
--
Martin