On Thu, May 7, 2009 at 9:29 AM, Aahz <a...@pythoncraft.com> wrote:
>
> Here's my download script to get you started figuring this out, it does
> the wget in the background so that several downloads can run in parallel
> from a single terminal window:
>
> #!/bin/bash
>
> echo "Downloading $1"
> wget "$1" > /dev/null 2>&1 &
> --
> Aahz (a...@pythoncraft.com)           <*>         http://www.pythoncraft.com/
>
> "It is easier to optimize correct code than to correct optimized code."
> --Bill Harlan
> --
> http://mail.python.org/mailman/listinfo/python-list
Aahz,

Thanks for the reply, but unfortunately that script goes in completely the
wrong direction.

Firstly, downloading multiple files in tandem does not speed up the
process: it merely cuts the already limited bandwidth into even smaller
pieces and delays every download in progress. It is much better to queue
the downloads so they run one by one.

Secondly, that approach is based on bash rather than Python. I know I
could use the `&` operator on a command line to background processes, but
I would like to have more control through Python, using the subprocess or
threading modules.

--
http://mail.python.org/mailman/listinfo/python-list
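For what it's worth, here is a minimal sketch of the queued, one-by-one
approach described above, using a single worker thread that pulls URLs off a
queue and runs wget via subprocess. The function name `download_all` and the
`command` parameter are my own illustration, not anything from an existing
library:

```python
import queue
import subprocess
import threading

def download_all(urls, command=("wget", "-q")):
    """Download URLs strictly one at a time in a background thread.

    `command` is the downloader invocation; each URL is appended as its
    final argument. Returns the downloads' exit codes, in order.
    """
    url_queue = queue.Queue()
    results = []

    def worker():
        while True:
            url = url_queue.get()
            if url is None:        # sentinel: no more work
                break
            print("Downloading %s" % url)
            # Run the download in the foreground of this worker thread,
            # so downloads happen strictly one-by-one, never in parallel.
            results.append(subprocess.call(list(command) + [url]))

    thread = threading.Thread(target=worker)
    thread.start()

    # The caller's thread stays free while the worker drains the queue.
    for url in urls:
        url_queue.put(url)
    url_queue.put(None)            # tell the worker to exit
    thread.join()
    return results
```

You would call it as `download_all(["http://...", "http://..."])`, and
because the enqueueing happens from the main thread, more URLs could be
added while earlier downloads are still running.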