MemoryError on commit with large file

Robert Collins robertc at robertcollins.net
Fri Oct 5 05:59:41 BST 2007


On Mon, 2007-10-01 at 20:10 -0700, Joel Hardi wrote:
> 
> Are there plans to accommodate arbitrarily large files, either by
> diffing using file streams, or just not diffing files at all if they
> meet some size threshold, match a glob like *.dv, or throw a
> MemoryError (I'd be perfectly happy with that, since these are binary
> files that I don't need to diff)?
> 
> FYI, I've also been running Mercurial on the same file set for the  
> past week on the same system; it allows me to commit and make local  
> branches on these files, but runs out of memory on network copies,  
> which is what brought me to bzr.

I think we all agree we should handle these situations better.

If you have some Python skills, I'd be delighted to guide you through
addressing this error.

Basically, I think the right approach is to:
 * try the fast path
 * catch MemoryError
 * fall back to a slower, file-based approach

This will work for most cases and will substantially reduce the number
of in-memory copies of file content, but we may still fall down on
merge, where reducing memory usage is somewhat trickier.
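
To make the shape of that concrete, here is a rough sketch in plain
Python. This is not the actual bzrlib code; the function names and the
hash-based fallback are placeholders for illustration, chosen only to
show the try-fast-path / catch-MemoryError / fall-back structure:

import difflib
import hashlib

def diff_file(old_path, new_path):
    """Diff two files, degrading gracefully when they are too large."""
    try:
        # Fast path: read both versions fully into memory and diff them.
        with open(old_path, 'rb') as f:
            old_lines = f.readlines()
        with open(new_path, 'rb') as f:
            new_lines = f.readlines()
        return list(difflib.diff_bytes(difflib.unified_diff,
                                       old_lines, new_lines))
    except MemoryError:
        # Slow path: stream the files in bounded-size chunks instead of
        # holding them in memory; just report whether the content changed.
        if _streamed_sha1(old_path) == _streamed_sha1(new_path):
            return []
        return [b'Binary files differ (too large to diff in memory)\n']

def _streamed_sha1(path, chunk_size=1024 * 1024):
    """Hash a file without ever holding more than one chunk in memory."""
    digest = hashlib.sha1()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()

In the real fix the fallback would of course live inside bzr's own diff
and commit paths rather than a standalone helper, but the control flow
would look much like the above.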

-Rob

-- 
GPG key available at: <http://www.robertcollins.net/keys.txt>.