[MERGE] [BUG #165061] Force http.readv() to a maximum number of range requests.

Vincent Ladeuil v.ladeuil+lp at free.fr
Tue Nov 27 08:52:35 GMT 2007


>>>>> "john" == John Arbash Meinel <john at arbash-meinel.com> writes:

    john> In: https://bugs.launchpad.net/bzr/+bug/165061
    john> Robert noticed that when working with his pack branch, it actually ended
    john> up downloading the same data multiple times.
    john> It happened because with the first pull, it tried to split up the request
    john> into too many sections, which triggered the 'issue 1 range for all readv()'
    john> code path. And I believe there is a small pack issue about requesting a
    john> piece at the beginning of the file, as well as all the signatures at the
    john> end of the file.

    john> Anyway, this patch should make Bazaar friendlier when trying to download a
    john> large subset of a pack repository. If we were downloading the whole thing,
    john> the ranges would collapse down into a single range.

    john> Robert noticed that he was getting a collapse down into 671 ranges. This
    john> code should force our requests to always be less than 200 ranges.

As long as _max_readv_combine gets the right value. And when it
gets the right value, it already collapses to *2* ranges.
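For readers not familiar with the readv() path, a rough sketch of what
"collapsing" means here (this is not the bzrlib code, just an
illustration; coalesce() and its fudge parameter are made up for the
example):

    # Rough illustration only -- not the actual bzrlib implementation.
    # Coalesce sorted (offset, length) read requests into ranges,
    # joining any two requests whose gap is at most `fudge` bytes.
    def coalesce(offsets, fudge=0):
        ranges = []
        for start, length in sorted(offsets):
            end = start + length
            if ranges and start - ranges[-1][1] <= fudge:
                # Close enough to the previous range: extend it.
                ranges[-1][1] = max(ranges[-1][1], end)
            else:
                ranges.append([start, end])
        return ranges

    # Nearby reads merge while distant ones stay separate, e.g.:
    #   coalesce([(0, 10), (12, 8), (10000, 5)], fudge=16)
    #   -> [[0, 20], [10000, 10005]]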

    john> This code also has the advantage that it should only
    john> collapse nearby ranges. So we don't end up putting the
    john> request at the beginning of the file into the same
    john> request as the one at the end of the file. Instead we
    john> keep grouping the close-together changes.

Already the case.

    john> It is inefficient in that it keeps retrying the
    john> collapse function with larger fudge factors, but this
    john> code path should generally be triggered rarely anyway.

But unfortunately it will loop endlessly.
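To make the concern concrete, here is a hypothetical version of the
retry pattern under discussion, assuming the coalesce() helper sketched
above (again, not the submitted patch): the loop has to be bounded, or
guaranteed to make progress, otherwise it can spin forever when the
range count never drops below the limit.

    # Hypothetical sketch, not John's patch.  Retry the collapse with an
    # ever larger fudge factor, but give up after a fixed number of
    # attempts instead of looping endlessly.
    def limited_coalesce(offsets, max_ranges, fudge=16, max_tries=10):
        for _ in range(max_tries):
            ranges = coalesce(offsets, fudge)
            if len(ranges) <= max_ranges:
                return ranges
            fudge *= 2
        # Last resort: issue a single range covering everything.
        return [[ranges[0][0], ranges[-1][1]]]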

    Vincent
