2GB limit
Robert Collins
robert.collins at canonical.com
Mon Oct 4 02:22:38 BST 2010
On Mon, Oct 4, 2010 at 2:10 PM, Stephen J. Turnbull <stephen at xemacs.org> wrote:
> Maritza Mendez writes:
>
> > 2. Does anyone know if any other dvcs system has solved the VM problem? If
> > so we might put our "big file" projects in git or Mercurial until bzr can
> > handle them.
>
> In theory, git should have no problems handling large files as long as
> you don't pack the repo (since it just compresses and uncompresses the
> file as a stream using zlib and OS calls -- if the OS and zlib can
> handle the file, git can), and it's probably OK packing the repo as
> well for the same reason -- it needs the packfile's index in memory,
> but not the packfile itself. Except for doing fastimport stuff, I
> don't think git ever needs a whole content file in memory.
IIRC git mmaps the file to do delta region expansion during pack. I'd
expect that to have the same issue (and in fact to be unable to handle
2GB files at all, for the same reason). IMBW.
> I believe Mercurial works the same way (ie, it treats content files as
> streams), but I've never had the need to worry about it.
hg converts files being committed to a list of lines and does a
line-based diff against the last version of the same file.
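In Python terms, that looks roughly like the sketch below (a minimal
illustration using the stdlib difflib, not hg's actual code). The point is
the memory cost: both versions of the file end up fully in memory as lists
of lines before the diff even starts.

```python
import difflib

def line_diff(old_text, new_text):
    # Both whole versions are materialised as lists of lines here --
    # for a multi-GB file, that alone can exhaust the address space.
    old_lines = old_text.splitlines(keepends=True)
    new_lines = new_text.splitlines(keepends=True)
    return list(difflib.unified_diff(old_lines, new_lines,
                                     "a/file", "b/file"))
```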
-Rob