[EMAIL PROTECTED] wrote:
> Hello.
> Though Python supports threading, I think it is limited to Python code -
> as soon as you issue a command that uses an external (C?) module, all of
> your Python threads hang until this command returns. Is that true?
>
> I'm using urllib2 to download many files, and I have a double problem:
> 1. Downloading all of them is painfully slow since it's serial - one
>    has to finish before the next request gets sent.
> 2. My GUI becomes non-responsive during the downloads - major problem!
>
> Is there any way to work around that? I want to run multiple download
> streams in parallel, while keeping my program responsive. Are there
> alternative modules that I can use for that?
>
> Any ideas? Thanks a lot!

Others have spoken to the specifics of threads, etc. What I wanted to ask
is why you expect parallel downloads to be faster in the first place.
Unless you are downloading from several different sites, and your own
download bandwidth is much greater than what any single site can serve,
you won't speed things up. If you are downloading several files from the
SAME site, I doubt threading will help: most likely you are limited by
that site's upload bandwidth. The one case where you could gain something
is if the site throttles each individual connection - then several
parallel connections let you claim more of its total bandwidth. Either
way, you can't push bits through the pipes faster than their upper
bandwidth allows.
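That said, even when total bandwidth caps the throughput, threads still solve the GUI-freezing half of the problem: urllib2's blocking socket reads release the GIL, so worker threads really do overlap, and the GUI thread stays free. A minimal sketch of the worker-pool pattern (written generically - `fetch` stands in for whatever actually grabs one URL, e.g. `urllib2.urlopen(url).read()` in your setup; the demo below uses a network-free stand-in so it runs anywhere):

```python
import threading
import queue

def download_all(urls, fetch, num_workers=4):
    """Fetch every URL using a pool of worker threads.

    `fetch` is the callable that retrieves one URL (a placeholder for
    urllib2.urlopen(url).read() or similar).  Returns a dict mapping
    each URL to whatever `fetch` returned for it.
    """
    tasks = queue.Queue()
    for url in urls:
        tasks.put(url)

    results = {}
    lock = threading.Lock()  # guard the shared results dict

    def worker():
        while True:
            try:
                url = tasks.get_nowait()
            except queue.Empty:
                return                # no work left, thread exits
            data = fetch(url)         # blocking I/O releases the GIL
            with lock:
                results[url] = data

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:                 # in a GUI you would poll instead
        t.join()                      # of joining, to stay responsive
    return results

if __name__ == "__main__":
    # Stand-in fetch so the sketch runs without network access.
    fake_fetch = lambda url: "payload from %s" % url
    out = download_all(["http://a", "http://b", "http://c"], fake_fetch)
    print(sorted(out))
```

In a real GUI program you would not `join()` on the main thread; instead have the workers post completion notices to a queue that the GUI's event loop polls, so the window keeps repainting while downloads run.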
Just some thoughts to consider.

-Larry Bates
--
http://mail.python.org/mailman/listinfo/python-list