In article <mailman.5307.1241805968.11746.python-l...@python.org>,
The Music Guy <music...@alphaios.net> wrote:
>On Thu, May 7, 2009 at 9:29 AM, Aahz <a...@pythoncraft.com> wrote:
>>
>> Here's my download script to get you started figuring this out, it does
>> the wget in the background so that several downloads can run in parallel
>> from a single terminal window:
>>
>> #!/bin/bash
>>
>> echo "Downloading $1"
>> wget "$1" > /dev/null 2>&1 &
>
>Thanks for the reply, but unfortunately that script is going in the
>complete wrong direction.
Not really; my point was that you could use something similar to
process files after downloading.

>Firstly, downloading multiple files in tandem does not speed up the
>process as it merely cuts up the already limited bandwidth into even
>smaller pieces and delays every download in progress. It is much
>better to queue downloads to occur one-by-one.
>
>Secondly, that approach is based on bash rather than Python. I know I
>could use the `&` operator on a command line to background processes,
>but I would like to be able to have more control through Python and
>the use of the subprocess or threading modules.

Threading probably won't get you anywhere; I bet processing the files
is CPU-intensive, and the GIL keeps Python threads from using more than
one core for that kind of work. I suggest looking into the
multiprocessing module, which would then run the file processing in
separate worker processes.
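Untested sketch of what I mean (assumes wget is on your path, and
process_file() is a stand-in for whatever you actually do to each
file):

import subprocess
from multiprocessing import Pool

def download(url):
    # One wget at a time, so downloads queue up instead of splitting
    # the limited bandwidth.
    subprocess.call(["wget", "-q", url])

def process_file(url):
    # Stand-in for the CPU-intensive processing; each call runs in a
    # separate worker process, so it gets its own core.
    return url

if __name__ == "__main__":
    urls = ["http://example.com/a", "http://example.com/b"]
    pool = Pool()          # defaults to one worker per CPU
    results = []
    for url in urls:
        download(url)      # blocks until this download finishes
        results.append(pool.apply_async(process_file, (url,)))
    pool.close()
    for r in results:
        r.get()            # re-raises any exception from a worker
    pool.join()

apply_async() returns immediately, so the main loop can start the next
download while the workers are still chewing on the previous file.
--
Aahz (a...@pythoncraft.com)           <*>         http://www.pythoncraft.com/

"It is easier to optimize correct code than to correct optimized code."
--Bill Harlan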