Out of Memory a bridge too far

Andrew Bennetts andrew at bemusement.org
Sun Nov 13 10:57:16 UTC 2011


Some other thoughts I had:

I wrote:
[…]
>  - and, of course, this greatly mitigates the memory consumption caused
>    by bzr's current implementation that IIRC needs roughly 2-3x the
>    memory of the largest file in the tree.

This statement assumes the primary cost is due to extracting texts from a
repository, not the working tree / build commit / merge¹ code.  I think
that's right; and even if it's not, that part is almost certainly easier
to fix.

[…]
> corner cases, and also IIRC the existing view hooks assume a 1-to-1
> relationship between transformed and untransformed files.

And fixing this would be useful for other hacks too (e.g. presenting a
tarball as a directory, or vice versa, or a directory of NEWS-file
fragments as a single, canonically ordered and formatted NEWS file).

I suspect there are some merges that could be easier if you can change
the representation of a file into multiple files that can be considered
independently (and some merges would benefit from considering other
files changed at the same time when figuring out what changes to apply
to a particular file).

-Andrew.

¹ Although my hypothetical largefiles plugin could register a
  merge_file_contents hook that just marked all simultaneous changes to
  large files as conflicts, skipping any opening of the file contents at
  all.  Actually, you could easily write such a plugin today, although
  I'm not sure it would help much; merges don't seem to be the limiting
  operation for people at the moment.
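The footnote's policy can be sketched without touching bzr at all.  This
is a standalone illustration of the idea, under loudly-stated
assumptions: the names (`merge_large_file`, `FileVersion`), the size
threshold, and the result strings are all invented here, and this is not
the real merge_file_contents hook signature.

```python
from collections import namedtuple

# Arbitrary, invented cutoff for what counts as a "large" file.
LARGE_FILE_THRESHOLD = 50 * 1024 * 1024  # 50 MiB

# A file version described by cheap metadata only; contents never loaded.
FileVersion = namedtuple("FileVersion", ["size", "sha1"])

def merge_large_file(base, this, other):
    """Decide a large-file merge from size and hash metadata alone.

    Returns one of: 'not_applicable' (small file, use the normal merger),
    'unchanged', 'take_this', 'take_other', or 'conflict'.
    """
    if max(base.size, this.size, other.size) < LARGE_FILE_THRESHOLD:
        return "not_applicable"
    this_changed = this.sha1 != base.sha1
    other_changed = other.sha1 != base.sha1
    if this_changed and other_changed:
        return "conflict"   # both sides touched it: flag it, don't diff it
    if this_changed:
        return "take_this"
    if other_changed:
        return "take_other"
    return "unchanged"
```

The memory win is that the conflict decision needs only sizes and hashes
that the repository already stores, so the 2-3x-largest-file cost of
extracting texts never arises for these files.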

More information about the bazaar mailing list