"T Koster" wrote: > Currently, I am using system (os.system) to run wget. The mechanism is > in a loop, so that it will try all the mirrors while wget is exiting > with a non-zero exit status. This is working fine as long as the user > feels there is no need to interrupt it.
any reason you cannot use urllib2 (or urllib) with a socket timeout
instead?  something like:

    import socket, urllib

    list_of_mirrors = ...

    socket.setdefaulttimeout(10)

    for url in list_of_mirrors:
        try:
            f = urllib.urlopen(url)
        except IOError:
            print "trying another mirror"
        else:
            break

    # copy from the f stream to local file

might work.

</F>

--
http://mail.python.org/mailman/listinfo/python-list
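For what it's worth, a fleshed-out sketch of the same idea, including the
"copy from the f stream to local file" step (here done with
shutil.copyfileobj). This is written against modern Python 3, where
urllib.urlopen has become urllib.request.urlopen and IOError is an alias
of OSError; the mirror URLs and the fetch() helper name are made up for
illustration:

```python
import shutil
import socket
import urllib.request  # urllib.urlopen in the original Python 2 post

# hypothetical mirror list -- substitute your real URLs
list_of_mirrors = [
    "http://mirror1.example.com/file.tar.gz",
    "http://mirror2.example.com/file.tar.gz",
]

socket.setdefaulttimeout(10)  # give up on an unresponsive mirror after 10s


def fetch(mirrors, local_path):
    """Try each mirror in turn; save the first successful download.

    Returns the URL that worked, or raises IOError if every mirror fails.
    A timeout or refused connection surfaces as IOError/OSError, so the
    loop simply moves on to the next mirror -- and unlike a blocking
    os.system("wget ..."), Ctrl-C (KeyboardInterrupt) is not caught here
    and will interrupt the whole loop immediately.
    """
    for url in mirrors:
        try:
            f = urllib.request.urlopen(url)
        except IOError:
            print("trying another mirror")
        else:
            # copy from the f stream to the local file
            with open(local_path, "wb") as out:
                shutil.copyfileobj(f, out)
            f.close()
            return url
    raise IOError("all mirrors failed")
```

Because the timeout is set on the socket module itself, it applies to every
connection urllib opens, so one dead mirror costs at most ten seconds.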