Pathetic performance with large files

Martin Pool mbp at canonical.com
Thu Nov 23 08:26:13 GMT 2006


On 16/11/2006, at 7:01 PM, MDK wrote:

> I'm new to bzr, and I very much like the ideas behind it. However, I'm
> currently having a big problem trying to set up a rather large
> (private) repository. The tree consists of several small text files
> and a few large (~70 - 140MB) binary ones -- totalling around 700MB.
>
> It seems that operations such as "checkout" or "commit" (local or
> remote) take around (2.5 * repository size) of memory. With repository
> of 700MB, that easily brings any machine to its knees -- locking the
> whole workstation, swapping in and out for 2h.
>
> Is there anything I can do now to improve the situation, or are there
> any plans to improve support for large repos?

Right at the moment all you can do is try to debug it, avoid adding  
the large files, or get more memory.  There's no in-principle reason  
why it should use memory proportional to the whole repo size: we may  
be caching something for too long.
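If you do want to debug it, one simple starting point (a sketch, not anything built into bzr) is to check the process's peak resident memory from inside Python, since bzr itself is written in Python.  This assumes a Unix system; note ru_maxrss is reported in kilobytes on Linux but in bytes on Mac OS X:

```python
# Print the peak resident set size of the current Python process.
# Useful dropped into a suspect code path or run from a debugger.
# Assumes Unix; ru_maxrss is KB on Linux, bytes on Mac OS X.
import resource

peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print("peak RSS:", peak)
```

Comparing that figure before and after a suspect operation would show whether memory really grows with repository size.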

However in the near term we aren't planning to support single files  
that are too large to hold two copies in memory at a time.

-- 
Martin
