andrew.voznytsa at gmail.com
Wed Feb 21 19:16:37 GMT 2007
John Arbash Meinel wrote:
> Andrew Voznytsa wrote:
>> It seems that bzr failed during committing media/Test002.mpg which is
>> about 240 Mb (media/Test001.mpg was about 143 Mb).
>> PC spec attached.
>> bzr worked with shared repository over sftp (sftp server is localhost).
> Thanks for the report. Do you know how much memory was in use when it
bzr ate around 680 Mb and crashed after some time.
I set the paging file size to 2 Gb, but nothing changed.
> At this point, we have focused on supporting versioning source code, and
> making that fast. So we have the explicit requirement that we can hold 3
> copies of a file in memory. (base, this, other for merging).
I'd mention that large files are quite common in the multimedia software
field, for example in regression tests. Some of these files can be very
large (HDTV clips), so loading a whole file into memory (never mind
several copies of it) may simply be impossible. I believe you are aware
of such cases, so I just want to bring it up again and ask when (or
whether) you plan to implement support for large files.
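As a side note, the usual way to keep memory bounded is to process a
file in fixed-size chunks rather than reading it whole. Here is a
minimal Python sketch (not bzr code; the function name and chunk size
are mine) showing incremental hashing with O(chunk_size) memory:

```python
import hashlib
import os
import tempfile

def sha1_streaming(path, chunk_size=1 << 20):
    """Hash a file in fixed-size chunks so peak memory stays around
    chunk_size bytes, regardless of how large the file is."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            h.update(chunk)
    return h.hexdigest()

# Demo with a small temp file; the same code handles multi-Gb media
# files without ever holding the whole file in memory.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"x" * 3_000_000)
    name = tmp.name
digest = sha1_streaming(name)
os.unlink(name)
```

The same chunked approach applies to copying or delta computation; only
the per-chunk state (hash context, window) needs to stay resident.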
> For commit, we expect to have at least 2 copies (and possibly a bit more
> if you have a lot of deltas.
> It shouldn't need to have that many copies of all the files at the same
> time, so if it does we need to fix that.