New submission from Jacques Grove <jacq...@tripitinc.com>:

When a urllib2 fetch of a URL results in a redirect, the connection to the redirect target does not inherit the timeout of the original URL opener. The result is that the redirected fetch (which is a new request) gets the default socket timeout instead of the timeout the user originally requested. This is obviously a bug.
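For reference, a minimal sketch of how to hit this (the URL is made up; any resource that answers with a 3xx will do):

    import urllib2

    # The caller asks for a 5-second timeout on the original request...
    response = urllib2.urlopen("http://example.com/redirects-elsewhere",
                               timeout=5)
    # ...but the follow-up request that HTTPRedirectHandler issues for the
    # Location target is opened without a timeout argument, so it falls
    # back to the default socket timeout.
    print response.geturl()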
So we have in urllib2.py in 2.6.1:

    def http_error_302(self, req, fp, code, msg, headers):
        ...
        return self.parent.open(new)

This should be:

        return self.parent.open(new, timeout=req.timeout)

or something in that vein. Of course, to be 100% correct, you should probably keep track of how much time has elapsed since the original URL fetch went out and reduce the timeout accordingly, but I'm not asking for miracles :-)

Jacques

----------
components: Library (Lib)
messages: 80787
nosy: jacques
severity: normal
status: open
title: urllib2.py timeouts do not propagate across redirects for 2.6.1 (and 3.x?)
type: behavior
versions: Python 2.6

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue5102>
_______________________________________
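Until the timeout is propagated as suggested above, a caller-side work-around is to set the module-level default socket timeout, which the redirected request currently falls back to. A minimal sketch (the URL is again made up):

    import socket
    import urllib2

    # The redirected request is opened without a timeout argument, so it
    # uses the global default configured here.
    socket.setdefaulttimeout(5)
    response = urllib2.urlopen("http://example.com/redirects-elsewhere")
    print response.geturl()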