PYCURL considered unprogressive
Aaron Bentley
aaron.bentley at utoronto.ca
Mon Apr 10 13:56:02 BST 2006
Sorry, I just had to :-)
One of my longtime niggles is that the inventory file download takes a
long time, and during that time, no progress is indicated. I thought
this ought to be easily fixed; our transport interface returns a file
object, and files can be read in pieces of fixed size. So we ought to
be able to read a bit, update a progress bar, read a bit more, etc.
This is the approach I started on here:
http://code.aaronbentley.com/bzr/bzrrepo/bzr.progress/
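In outline, the shape of it is something like the loop below. This is only a sketch, not the code in that branch: the chunk size, the total_size argument, and the report_progress callable are placeholders for illustration, not bzrlib's actual progress API.

    CHUNK_SIZE = 32 * 1024  # arbitrary chunk size, for illustration only

    def read_with_progress(f, total_size, report_progress):
        """Read a file-like object in fixed-size pieces, reporting as we go."""
        pieces = []
        done = 0
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            pieces.append(chunk)
            done += len(chunk)
            report_progress(done, total_size)  # e.g. tick a progress bar
        return b''.join(pieces)

Of course, that only gives useful feedback if each read() call costs roughly what it transfers.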
The Transport API implies that f = t.get() will be constant-time, and
f.read(x) will take time roughly proportional to x. That isn't
guaranteed, though, and in fact it's not what the PYCURL transport does.
The urllib transport does behave this way.
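That's just the natural shape of a urlopen-style API: the call returns once
the headers are in, and the body comes off the socket as you read(). A sketch
with the standard library's urlopen (the modern urllib.request spelling and a
made-up URL, not our actual transport code):

    import urllib.request

    resp = urllib.request.urlopen('http://example.com/inventory')  # returns after the headers
    while True:
        chunk = resp.read(32 * 1024)  # each call pulls more of the body off the socket
        if not chunk:
            break
        # ...update the progress bar here...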
There appears to be no way to convince PYCURL to transfer data
iteratively: you simply call perform(), and you are notified when the
transfer is complete. The only way to handle the data in chunks is to
supply a consumer as WRITEFUNCTION, and that is a push interface:
libcurl hands chunks to the consumer as they arrive, whereas a file-like
object has to hand data back on demand from read(). So we can't
implement a file-like object as a consumer.
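To make the push model concrete, here is a bare pycurl sketch (plain pycurl
usage with a placeholder URL, not our transport code). Everything happens
inside the one blocking perform() call; by the time control returns, there is
nothing left for a caller to read() incrementally.

    import pycurl
    from io import BytesIO

    buf = BytesIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, 'http://example.com/inventory')  # placeholder URL
    c.setopt(pycurl.WRITEFUNCTION, buf.write)  # libcurl pushes each chunk at us
    c.perform()   # blocks until the whole body has been written into buf
    c.close()
    data = buf.getvalue()  # only now do we have something to wrap in a file object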
PYCURL does support a PROGRESSFUNCTION callback, so I think we'll be
able to get progress updates out of it. The problem is that we shouldn't
report progress in two places, so how do we know whether to do it on
read()? We could detect StringIO, I guess, but that hardly seems like a
good approach.
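For reference, wiring up the progress hook looks roughly like this (again a
standalone pycurl sketch with a placeholder URL, not a patch). The callbacks
are disabled unless NOPROGRESS is cleared, and they fire inside perform()
rather than from anything resembling read(), which is exactly the mismatch
described above.

    import pycurl
    from io import BytesIO

    def on_progress(download_total, downloaded, upload_total, uploaded):
        # called periodically by libcurl while perform() is running
        if download_total:
            print('%d of %d bytes' % (downloaded, download_total))

    buf = BytesIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, 'http://example.com/inventory')  # placeholder URL
    c.setopt(pycurl.WRITEFUNCTION, buf.write)
    c.setopt(pycurl.NOPROGRESS, 0)                  # progress callbacks are off by default
    c.setopt(pycurl.PROGRESSFUNCTION, on_progress)
    c.perform()
    c.close()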
Once again, I'm surprised there's no HTTP library that does exactly what
we want. It's not exactly an obscure protocol!
Aaron