installing Ubuntu https PPAs with squid caching

Stuart McGraw smcg4191 at mtneva.com
Mon Nov 19 05:16:48 UTC 2018


TL;DR: How do I set up apt and a Squid proxy so
that https URLs are cached?

I recently tried a novel (to me) way of installing
and configuring Ubuntu by scripting the install.
The idea is that the script(s) provide a record of
what was installed and how, provide for disaster
recovery (I back up only user files), allow for
easier future reinstalls, and let me duplicate my
current configuration in a VM for testing new
software without risk of trashing my main machine.
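
(To give an idea of what I mean by scripting the
install, the scripts boil down to batches of
commands something like this; the PPA and package
names here are only illustrative.)

   #!/bin/sh -e
   # illustrative fragment of an install script: add a PPA,
   # then install packages non-interactively so the whole
   # run can be repeated
   add-apt-repository -y ppa:some-team/some-app   # hypothetical PPA
   apt-get update
   DEBIAN_FRONTEND=noninteractive apt-get install -y some-app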

Getting the scripts working (and maintaining changes
going forward) requires running them dozens of times.

But... I live in third-world America and have a slow
internet connection with a data cap.

The single thing that made this practical was
setting up a Squid caching proxy on another local
machine and configuring Apt to use it when
installing, so that GBs of packages are not
downloaded multiple times.
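
(For concreteness, the squid side only needs a
handful of squid.conf tweaks, roughly like the
following; the sizes and refresh_pattern lines are
the usual suggestions for caching .deb files, so
take them as a sketch rather than my exact config.)

   # squid.conf fragment: allow large .deb packages to be cached
   maximum_object_size 512 MB
   cache_dir ufs /var/spool/squid 20000 16 256
   # package files never change for a given filename, so keep
   # them a long time (min/max values are in minutes)
   refresh_pattern -i \.deb$  129600 100% 129600
   refresh_pattern -i \.udeb$ 129600 100% 129600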

The problem I am finding is that, as I add new
software, many PPA repos use https rather than
http for access, and Squid doesn't seem to cache
these packages (presumably because the https
traffic passes through the proxy as an opaque
CONNECT tunnel that Squid can't inspect).  I am
also concerned that Ubuntu will at some point
switch to https, which would, for me, kill any
possibility of using scripts.
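
(By "PPA repos use https" I mean entries like the
following ending up under /etc/apt/sources.list.d/;
the team and PPA names are made up.)

   # hypothetical example of an https PPA entry
   deb https://ppa.launchpad.net/some-team/some-app/ubuntu bionic main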

Is there some way of setting up Apt and the Squid
proxy so that it will cache https URLs?

I currently set the following on the machine being
installed:
   # cat /etc/apt/apt.conf.d/20proxy
   Acquire::http::Proxy "http://srvr1.home:3128/";
   Acquire::https::Proxy "http://srvr1.home:3128/";
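
(The same settings can also be passed per run with
-o, which is handy when testing whether the proxy
is actually being used:)

   # one-off equivalent of the conf file above
   apt-get -o Acquire::http::Proxy="http://srvr1.home:3128/" \
           -o Acquire::https::Proxy="http://srvr1.home:3128/" update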

Googling has turned up a lot of info that is old,
not concerned with caching, expects to work with
every app and every user (I care only about apt
and root), is complex/over-generalized (SSLbump),
etc.
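
(For reference, the SSLbump recipes I keep finding
boil down to squid.conf fragments like the sketch
below, plus generating a local CA that every client
has to trust.  The paths are placeholders and the
directive names differ a bit between Squid
versions, which is part of what makes it feel
over-generalized for my one-machine, apt-only
case.)

   # typical SSLbump sketch (untested; Squid 4 style directive names)
   http_port 3128 ssl-bump cert=/etc/squid/ca.pem generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
   sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/squid/ssl_db -M 4MB
   acl step1 at_step SslBump1
   ssl_bump peek step1
   ssl_bump bump all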

Does anyone have any suggestions for my particular
use case?



