Out of memory on decompress

Martin Pool mbp at canonical.com
Wed Sep 29 01:23:08 BST 2010


Hi Karl,

Doing a chunked iterator for large files is basically a good way to
proceed here.  Please make the chunk limit a config variable so that
we can test different settings either in the test case or on user
machines. We would be very happy to have you work on it.
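Roughly the shape I have in mind for the iterator (just a sketch; the
names here are made up rather than real bzr internals, and the real
code would read the chunk size from the config stack):

    import zlib

    DEFAULT_CHUNK_SIZE = 4 * 1024 * 1024  # placeholder default; would come from config

    def iter_decompressed_chunks(source, chunk_size=DEFAULT_CHUNK_SIZE):
        """Yield decompressed data from a file-like object one chunk at a
        time, so the whole decompressed text is never held in memory."""
        decompressor = zlib.decompressobj()
        while True:
            raw = source.read(chunk_size)
            if not raw:
                break
            yield decompressor.decompress(raw)
        tail = decompressor.flush()
        if tail:
            yield tail

The caller would then write each chunk straight out to its destination
rather than accumulating them in a list.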

You might like to test under 'ulimit' to make sure bzr fails if it
tries to allocate more than a certain amount of memory.
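If it's more convenient to do that from inside the test suite than from
the shell, something like this has roughly the same effect as 'ulimit -v'
on Linux (plain unittest style rather than bzr's own TestCase, and the
128MB figure is just for illustration):

    import resource
    import unittest

    MEMORY_CAP = 128 * 1024 * 1024  # arbitrary address-space limit for the test

    class TestLowMemoryDecompress(unittest.TestCase):

        def setUp(self):
            # Cap the address space so a runaway allocation raises
            # MemoryError instead of swapping the machine to death.
            self._old_limit = resource.getrlimit(resource.RLIMIT_AS)
            resource.setrlimit(resource.RLIMIT_AS,
                               (MEMORY_CAP, self._old_limit[1]))

        def tearDown(self):
            resource.setrlimit(resource.RLIMIT_AS, self._old_limit)

        def test_decompress_large_file(self):
            # ... exercise the chunked decompression here ...
            pass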

As a way of getting your toe in the water, you could work on the bug
about logging some details of where memory is used when we run out:
<https://bugs.edge.launchpad.net/bzr/+bug/551391>.

You can use Meliae
<http://jam-bazaar.blogspot.com/2009/11/memory-debugging-with-meliae.html>
to get some idea of where memory is going, and John can help you
understand the results.
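If I remember the workflow from that post correctly, it boils down to
dumping the object graph at the interesting moment and then loading and
summarizing the dump afterwards (the filename here is just an example):

    from meliae import scanner, loader

    # At the point where memory use looks suspicious, dump the object graph:
    scanner.dump_all_objects('bzr-memory.json')

    # Then, in a separate interpreter, load the dump and summarize it:
    om = loader.load('bzr-memory.json')
    om.summarize()   # table of object counts and total size per type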

-- 
Martin


