On Jan 6, 5:36 am, Philip Semanchuk <phi...@semanchuk.com> wrote:
> On Jan 5, 2010, at 11:26 PM, aditya shukla wrote:
>
> > Hello people,
> >
> > I have 5 directories corresponding to 5 different urls. I want to
> > download images from those urls and place them in the respective
> > directories. I have to extract the contents and download them
> > simultaneously. I can extract the contents and do them one by one.
> > My question is: for doing it simultaneously do I have to use threads?
>
> No. You could spawn 5 copies of wget (or curl or a Python program that
> you've written). Whether or not that will perform better or be easier
> to code, debug and maintain depends on the other aspects of your
> program(s).
>
> bye
> Philip
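Philip's suggestion could be sketched like this: the parent process launches one child per directory and then waits for all of them, so the downloads run in parallel without any threads in your own code. The URL/directory pairs are hypothetical, and the real child command would be something like `["wget", "-P", directory, url]`; a trivial Python child is used here only so the sketch runs anywhere:

```python
import subprocess
import sys

# Hypothetical mapping of target directory -> url to fetch from.
jobs = {
    "dir1": "http://example.com/page1",
    "dir2": "http://example.com/page2",
}

procs = []
for directory, url in jobs.items():
    # Stand-in for: subprocess.Popen(["wget", "-P", directory, url]).
    # The child here just echoes its directory so the sketch is runnable.
    p = subprocess.Popen(
        [sys.executable, "-c", "import sys; print(sys.argv[1])", directory],
        stdout=subprocess.PIPE,
    )
    procs.append((directory, p))

# Wait for every child to finish; the parent stays simple --
# no locks, no shared state, no threading module.
for directory, p in procs:
    out, _ = p.communicate()
    print(directory, "exit code:", p.returncode)
```

Because each child writes into its own directory, there is no shared mutable state to protect, which is exactly what makes this easier to debug than a threaded version.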
Yep, the simpler and more straightforward the approach, the better: threads are error-prone (for programmers) by nature. But my question would be: does it REALLY need to be simultaneous? The CPU/OS only incurs more overhead doing this in parallel with processes. Measuring the sequential version first and then optimizing where needed (e.g. for user response time) would be my preferred way to go. Less = More.

regards,
Marco
--
http://mail.python.org/mailman/listinfo/python-list
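Marco's "measure first" advice amounts to timing the sequential loop before reaching for parallelism. A minimal harness might look like the following, where `fetch()` is a hypothetical stand-in for the real per-URL download (e.g. `urllib.request.urlretrieve` in the actual program):

```python
import time

def fetch(url):
    # Placeholder for the real download work; returns something
    # measurable so the sketch is self-contained.
    return len(url)

urls = ["http://example.com/a", "http://example.com/b"]

start = time.time()
results = [fetch(u) for u in urls]  # plain sequential processing
elapsed = time.time() - start

print("fetched %d urls in %.3f s" % (len(results), elapsed))
```

If the sequential number is already acceptable for the user, there is nothing to optimize; only if it is not do the extra processes (or threads) earn their complexity.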