MemoryError with versioning 100+MB file
John Arbash Meinel
john at arbash-meinel.com
Thu Dec 20 14:39:58 GMT 2007
Alexander Belchenko wrote:
> My coworker tried to version a 1GB disk ISO image; the gzipped copy is
> about 120MB. He successfully committed 2 revisions and is now stuck
> with a MemoryError while reading the knit during commit.
>
> Is it possible to create some workaround now? These big files are
> purely binary; is it possible to skip the read-knit phase entirely?
>
>
It still creates a delta against the previous text, which isn't really
something I think you want to give up when you have a 120MB file. (Do you want
the repository to grow by 120MB for every commit?)
I suppose if it is gzipped it might grow by that much anyway.
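To see why deltas rarely help for gzipped content: compressed formats turn a
small change in the source data into widespread changes in the compressed
bytes, so a delta against the previous compressed text saves almost nothing.
A minimal sketch of this (using Python's zlib, which produces the same
deflate stream that gzip does; the data here is a made-up stand-in for the
ISO image):

import zlib

# Two "revisions" of a compressible file, differing by one leading byte.
v1 = b"0123456789" * 100_000   # ~1 MB of compressible data
v2 = b"X" + v1[1:]             # a single byte changed near the start

c1 = zlib.compress(v1)
c2 = zlib.compress(v2)

# Measure how much of the two compressed streams is identical.
common = 0
for a, b in zip(c1, c2):
    if a != b:
        break
    common += 1

print("compressed size:", len(c1))
print("identical prefix of compressed streams:", common, "bytes")

The identical prefix is only a handful of bytes: past the header, the two
compressed streams diverge almost immediately, even though the inputs are
99.9999% the same.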
I will say, though: is versioning an ISO image really the right way to do what
they want?
There isn't really a way to externally force a new fulltext to be written
rather than a delta against the parent.
There is a flag, KnitVersionedFile.delta = True/False (this is how
revisions.knit is kept as all fulltexts).
So if you wanted to hack it in, you could have a plugin which did something like:

  from bzrlib.repofmt import pack_repo

  # File ids whose texts should always be stored as fulltexts, never deltas.
  _fulltext_ids = set(['big-file-file-id'])

  _orig_get = pack_repo.KnitPackRepository.get_weave_or_empty

  def get_fulltext_weave(self, file_id, transaction):
      w = _orig_get(self, file_id, transaction)
      if file_id in _fulltext_ids:
          w.delta = False  # disable delta compression for this file
      return w

  pack_repo.KnitPackRepository.get_weave_or_empty = get_fulltext_weave
You may need to do a similar thing for
bzrlib.repository.Repository.get_weave_or_empty
(if they are using Knits rather than Packs).
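The wrapping pattern above is independent of bzrlib, so here is a
self-contained sketch of the same technique; the class and method names are
made-up stand-ins, not bzrlib's real API:

  class VersionedFile:
      """Stand-in for a knit: stores deltas by default."""
      def __init__(self, file_id):
          self.file_id = file_id
          self.delta = True

  class Repo:
      """Stand-in for the repository class being patched."""
      def get_weave_or_empty(self, file_id, transaction):
          return VersionedFile(file_id)

  _fulltext_ids = set(['big-file-file-id'])
  _orig_get = Repo.get_weave_or_empty  # keep the original method

  def get_fulltext_weave(self, file_id, transaction):
      w = _orig_get(self, file_id, transaction)
      if file_id in _fulltext_ids:
          w.delta = False  # force fulltext storage for this file
      return w

  Repo.get_weave_or_empty = get_fulltext_weave  # install the wrapper

  repo = Repo()
  print(repo.get_weave_or_empty('big-file-file-id', None).delta)  # False
  print(repo.get_weave_or_empty('other-file-id', None).delta)     # True

Because the wrapper calls the saved original method and only tweaks the
returned object, all other file ids keep the default delta behavior.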
John
=:->