On Dec 17, 2007 7:51 PM, Emmet Hikory <emmet.hikory@gmail.com> wrote:
> Bandwidth will always be location dependent, so package size
> related to bandwidth is not an easy item for discussion. Smaller is
> nice, but for some it doesn't matter very much, as their connections
> are either fast enough that it's not important, or slow enough that
> they download overnight (or otherwise deeply backgrounded) anyway.
>
> Processor speed is similarly variable. Not that benchmarks (1)
> mean much, but the interesting point is only relative time for
> decompression, as for most packages the IO load delay is concentrated
> on unpack rather than decompress (and further that packing happens
> once per revision per architecture, and unpacking happens a lot). Is
> 3-5 times as long acceptable? Maybe. Is 10-15 times as long
> acceptable? Probably not (so we shouldn't use bzip2 unless it gets a
> lot faster).

If decompression time isn't a huge factor to begin with, and most of the install time is spent putting files in the right place during unpack, do we really need to worry much about it? We know for sure that broadband isn't available, or at least isn't practical, in a lot of areas. There are DVD sets that people can share to get everything from the repo, but those don't include security updates, so they would still be a problem for those users. If we can make update downloads smaller and faster, updates become more accessible to people in those low-bandwidth areas.
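As a quick sanity check on that relative cost, something like this (just a sketch; "data.tar" stands in for a package's uncompressed payload, and the numbers will of course vary by machine and by package) would show the decompress-time gap directly:

    # compress the same payload both ways at max compression
    gzip -9 -c data.tar > data.tar.gz
    bzip2 -9 -c data.tar > data.tar.bz2

    # time decompression alone, discarding output so disk writes don't skew it
    time gzip -dc data.tar.gz > /dev/null
    time bzip2 -dc data.tar.bz2 > /dev/null

If bzip2 lands in that 3-5x range on typical packages while cutting a meaningful number of bytes from security updates, the trade could still favor the low-bandwidth users we're talking about.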
-- 
Mackenzie Morgan
Linux User #432169
ACM Member #3445683
http://ubuntulinuxtipstricks.blogspot.com <-my blog of Ubuntu stuff
apt-get moo