bzr-svn fetching all revisions even though -r 500 passed to branch command

Jelmer Vernooij jelmer at samba.org
Mon Oct 29 13:39:24 GMT 2007


On Monday, 29.10.2007 at 12:40 +0100, Nicholas Allen wrote:
> I tried branching our subversion repository with bzr-svn. I know there 
> is a memory usage issue with the subversion libraries so I tried the 
> workaround previously mentioned of passing -r 500 to the branch command. 
> bzr-svn still tried to fetch all subversion revisions even though I 
> passed this option and of course my computer ran out of memory. This 
> means that it is currently impossible to branch any reasonably sized svn 
> repo. As a workaround I can ctrl-c it every 2000 revisions and then 
> start it again and it seems to carry on where it left off.
There are two things that bzr-svn has to do:

 * Fetch revision metadata for all revisions in the repository
 * Fetch individual revision content for all revisions in the branch
   you're cloning

The data from the first step will be cached in a database in
~/.bazaar/svn-cache when you first connect to a repository.

The number of revisions fetched in the second step can be limited using
the -r argument to various commands.
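
For example, branching only up to revision 500 would look something like
this (the URL is just a placeholder):

  bzr branch -r 500 svn://svn.example.com/repo/trunk mybranch

The metadata pass in the first step still walks the whole repository
history; only the revision contents fetched in the second step are limited
to the first 500 revisions.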

> Is there a better way to do this? Would it be possible to code a 
> workaround in bzr-svn so that it only fetches revisions in blocks of 500 
> or so to get around the svn memory leaks?
The workaround that the Mercurial folks have implemented is invoking
"svn log" and parsing the output. I'd be happy to include patches that
make it possible to use a similar workaround for bzr-svn.
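
A rough sketch of what such a workaround could look like: shell out to
"svn log --xml" in fixed-size blocks, so each svn process exits (and
releases its memory) after a few hundred revisions. The function name and
block size below are only illustrative, not existing bzr-svn code; the
results would presumably be fed into the existing svn-cache database.

import subprocess
import xml.etree.ElementTree as ElementTree

BLOCK_SIZE = 500

def iter_log_entries(repo_url, first_rev, last_rev):
    """Yield (revnum, author, date, message) for first_rev..last_rev."""
    start = first_rev
    while start <= last_rev:
        end = min(start + BLOCK_SIZE - 1, last_rev)
        # Each block is fetched by a short-lived svn process, so any
        # memory leaked inside the svn libraries is reclaimed when it exits.
        xml_output = subprocess.check_output(
            ["svn", "log", "--xml", "-r", "%d:%d" % (start, end), repo_url])
        for entry in ElementTree.fromstring(xml_output).findall("logentry"):
            yield (int(entry.get("revision")),
                   entry.findtext("author"),
                   entry.findtext("date"),
                   entry.findtext("msg"))
        start = end + 1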

Cheers,

Jelmer
-- 
Jelmer Vernooij <jelmer at samba.org> - http://samba.org/~jelmer/
Jabber: jelmer at jabber.fsfe.org
