On Wed, Nov 13, 2019 at 11:16:53AM -0500, Mark H Weaver wrote:
> For these reasons, I'm inclined to think that parallel downloads is the
> wrong approach.  If a single download process is not making efficient
> use of the available bandwidth, I'd be more inclined to look carefully
> at why it's failing to do so.  For example, I'm not sure if this is the
> case (and don't have time to look right now), but if the current code
> waits until a NAR has finished downloading before asking for the next
> one, that's an issue that could be fixed by use of HTTP pipelining,
> without multiplying the memory usage.
>
> What do you think?
I agree that parallel downloads are a kludge to work around the slow
set-up and tear-down of our download code.  Pipelining would help a
lot, and we could also profile the relevant Guile code to see whether
there are any easy speedups.

This issue was actually discussed a year ago:

  https://lists.gnu.org/archive/html/guix-devel/2018-11/msg00148.html

I'll quote Ludo's suggestion from then:

> I’d be in favor of a solution where ‘guix substitute’ is kept alive
> across substitutions (like what happens with ‘guix substitute --query’),
> which would allow it to keep connections alive and thus save the TLS
> handshake and a few extra round trips per download.
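
To make that concrete, here is a rough sketch of what pipelining over a
single kept-alive connection could look like with Guile's (web client),
(web request), and (web response) modules.  The URLs are made up, TLS
setup is left out (Guix has its own connection code for that), and a
real implementation would stream the nar bodies to disk rather than
buffer them:

  (use-modules (web client)      ;open-socket-for-uri
               (web request)     ;build-request, write-request
               (web response)    ;read-response, read-response-body
               (web uri)
               (rnrs bytevectors))

  ;; Hypothetical nar URLs, all on the same substitute server.
  (define uris
    (map string->uri
         '("http://substitutes.example.org/nar/gzip/aaaa-foo-1.0"
           "http://substitutes.example.org/nar/gzip/bbbb-bar-2.0")))

  ;; One TCP connection, reused for every request.
  (define sock (open-socket-for-uri (car uris)))

  ;; Pipelining: write all the requests up front...
  (for-each (lambda (uri)
              (write-request
               (build-request uri
                              #:version '(1 . 1)
                              #:headers `((host . (,(uri-host uri) . #f))))
               sock))
            uris)
  (force-output sock)

  ;; ...then read the responses back, in order.
  (for-each (lambda (uri)
              (let* ((resp (read-response sock))
                     (body (read-response-body resp)))
                ;; Real code would stream BODY to disk instead of
                ;; holding a whole nar in memory.
                (pk 'received uri (response-code resp)
                    (and body (bytevector-length body)))))
            uris)

  (close-port sock)

Whether our substitute servers handle pipelined requests gracefully
would need testing, but even plain keep-alive (one request at a time
over the same connection) saves the TCP and TLS handshakes that Ludo
mentioned.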