Attn: Jelmer & David (was [Fwd: [Rails-core] Re: ERROR on Rails Subversion repository])

Jelmer Vernooij jelmer at samba.org
Wed Sep 19 14:46:07 BST 2007


Hi Hendy,

Am Mittwoch, den 19.09.2007, 18:22 +0700 schrieb Hendy Irawan:
> Although I'm not sure about the exact culprit, there's a possibility
> that this happens not only on the Rails SVN server, but also on a
> bunch of other SVN servers in the world, whether they're
> misconfigured, running an old library version, or something else.
If this is not a configuration issue, it should ideally be fixed in the
Subversion client library so that all users of libsvn can benefit from
it.

> From the description, and from experiments, it seems true that
> limiting does have the "nice" effect of avoiding these errors, which
> means we can probably do a workaround.
> 
> My proposal is that bzr-svn issue "svn log -v" in bulk batches,
> i.e. 1000 revisions at a time. If that works out, it should behave
> more reliably with "buggy" SVN servers, and the performance shouldn't
> be that bad: even in a 10,000-revision repository we'd only do it 10
> times, and that is still better than trying to do it once and
> failing. :-(
Limiting the number of revisions fetched to 1000 at a time still causes
this error. Anything over 600 seems to break, which seems rather
arbitrary. I also suspect that repositories with long commit messages
break even earlier.
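
For what it's worth, the batched fetch Hendy describes only needs a small
loop around get_log(). The following is a rough sketch, assuming the SWIG
Python bindings and an already-open RA session; the argument order of
ra.get_log() and the BATCH_SIZE constant are assumptions for illustration,
not actual bzr-svn code:

from svn import ra

BATCH_SIZE = 500  # assumed knob; anything much over 600 broke in my tests

def fetch_log_batched(ra_session, latest_rev, receiver):
    # Walk revisions 0..latest_rev in fixed-size windows so no single
    # log response grows large enough to trigger the broken headers.
    start = 0
    while start <= latest_rev:
        end = min(start + BATCH_SIZE - 1, latest_rev)
        ra.get_log(ra_session, [""], start, end,
                   0,      # limit: 0 = no extra cap within this window
                   True,   # discover_changed_paths, i.e. "svn log -v"
                   False,  # strict_node_history
                   receiver)
        start = end + 1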

> I'll also try joining the Subversion list to see if anyone comes up
> with a better solution. But even if Subversion gets a fix in the
> future, there's no guarantee that everyone in the world will rush to
> update, so a workaround is unfortunately still needed. :-(
I'm not sure this is a misconfiguration on the Subversion side of
things. The rubyonrails server returns the following headers when I run
`svn log':

HTTP/1.1 200 OK
Date: Wed, 19 Sep 2007 13:27:54 GMT
Server: Apache
Served-By: Joyent
[... 48 more "Served-By: Joyent" lines ...]
MS-Author-Via: DAV
Vary: Accept-Encoding
Content-Encoding: gzip
[... 48 more "MS-Author-Via: DAV" lines ...]
Connection: close
Transfer-Encoding: chunked
Content-Type: text/xml; charset="utf-8"

Notice that the MS-Author-Via: DAV and Served-By: Joyent headers are repeated.

It almost seems as if a new copy of these headers is added for every x
kilobytes of data returned. Requesting 1000 revisions results in the two
headers being repeated 48 times, whereas requesting 500 revisions
results in them being repeated 26 times.
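
One way to confirm counts like these is to tally the header lines in a
saved response dump; a minimal sketch (the file name is just a
placeholder):

from collections import Counter

def count_headers(path):
    counts = Counter()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:                       # blank line ends the headers
                break
            if ":" in line:
                counts[line.split(":", 1)[0]] += 1
    return counts

for name, n in count_headers("rails-svn-headers.txt").most_common():
    print(name, n)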

I don't want to degrade the performance of bzr-svn just to work around a
single SVN repository with a broken configuration, but I can live with a
knob that allows setting the number of revisions fetched per
svn.ra.get_log() call.
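
Such a knob could be as simple as an integer option that defaults to "no
batching". Just as an illustration (the environment variable name and the
default are made up here, not an existing bzr-svn option):

import os

def log_chunk_size(default=0):
    # 0 means "no batching": fetch the whole history in one get_log() call.
    value = os.environ.get("BZR_SVN_LOG_CHUNK_SIZE")  # hypothetical name
    if value is None:
        return default
    try:
        return max(0, int(value))
    except ValueError:
        return default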

Cheers,

Jelmer
-- 
Jelmer Vernooij <jelmer at samba.org> - http://samba.org/~jelmer/
Jabber: jelmer at jabber.fsfe.org

