On Sun, 10 Mar 2002 [EMAIL PROTECTED] wrote:

> Lastly, the one not-completely-cool thing I noticed about jigdo is that,
> since many of the files are small and download relatively quickly (at
> DSL speeds), the single-ftp-logon-per-file nature of the way wget is
> being used means you spend a lot of time connecting, logging in
> anonymous, bringing up a tcp connection, etc. In short, my DSL modem is
> being used at far less than its maximum rate, so the process takes
> longer than it needs to. I'm kind of kicking around the idea of
> modifying it a bit to:
>
> 1) ask wget for more than one file at a time, to reduce the ftp login
> overhead, and

I was under the impression that it already did this, getting 5 or 10
files at a time. I might have read something wrong though.
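To illustrate the batching idea (this is not how jigdo or wget actually
implements it, just a minimal sketch of paying the connect/login cost once
per batch instead of once per file), something along these lines would do it
with Python's ftplib; the mirror host, directory, and file names below are
placeholders:

    # Sketch: fetch a batch of files over one FTP session so the TCP
    # connection and anonymous login happen once per batch, not per file.
    # Host, directory, and file names are hypothetical placeholders.
    from ftplib import FTP

    MIRROR = "ftp.example.org"                 # hypothetical mirror
    REMOTE_DIR = "/debian/pool/main"           # hypothetical directory
    FILES = ["foo_1.0.deb", "bar_2.3.deb", "baz_0.9.deb"]

    def fetch_batch(host, directory, filenames):
        ftp = FTP(host)                        # one TCP connection...
        ftp.login()                            # ...and one anonymous login
        ftp.cwd(directory)
        for name in filenames:
            with open(name, "wb") as out:
                ftp.retrbinary("RETR " + name, out.write)
        ftp.quit()

    fetch_batch(MIRROR, REMOTE_DIR, FILES)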
> 2) allow the entry/selection of multiple mirrors, so that I can
> distribute my load across multiple mirrors. (I don't mind maxing out
> *my* link, but it'd be nice if I could spread that load across a few
> mirrors so as not to monopolize one mirror's bandwidth. As a side
> effect, this might also have the effect of reducing the probability
> of the "Aargh!" problem, since, if a file failed at one mirror, I would
> try another.)
>
> Any suggestions on other things I could try to maximize download speeds
> and/or improve mirror-friendliness/robustness are welcome.

One could also recommend using http instead of ftp. It is a much better
protocol for just getting a file you know the name of, since you don't
have to log in. And at least for one mirror (the one I admin,
ftp.se.debian.org) the http server is faster and more lightweight than
the ftp server.

/Mattias Wadenstein
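As a rough sketch of combining the two suggestions above (fetching by plain
HTTP, which needs no login step, and rotating across several mirrors with
fallback when one fails), something like the following would work; the
mirror URLs and file paths are made-up examples, not jigdo's actual
behaviour:

    # Sketch: round-robin a list of HTTP mirrors, falling back to the
    # next mirror when a download fails, since an HTTP GET has no login.
    # Mirror URLs and file paths are illustrative placeholders.
    from urllib.request import urlopen
    from urllib.error import URLError
    from itertools import cycle

    MIRRORS = [
        "http://ftp.se.debian.org/debian/",    # mirror mentioned above
        "http://ftp.example.org/debian/",      # hypothetical second mirror
    ]
    FILES = ["pool/main/f/foo/foo_1.0.deb", "pool/main/b/bar/bar_2.3.deb"]

    def fetch(files, mirrors):
        ring = cycle(mirrors)
        for path in files:
            for _ in range(len(mirrors)):      # try each mirror at most once
                base = next(ring)
                try:
                    with urlopen(base + path) as resp, \
                         open(path.rsplit("/", 1)[-1], "wb") as out:
                        out.write(resp.read())
                    break                      # got it; next file
                except URLError:
                    continue                   # "Aargh!": try another mirror
            else:
                print("failed on every mirror:", path)

    fetch(FILES, MIRRORS)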