hello, I have a python3 script using urllib.request which has a strange behavior; here is the script:
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import urllib.request
import urllib.error
import sys, time

url = 'http://google.com'

def make_some_stuff(page, url):
    sys.stderr.write(time.strftime("%d/%m/%Y %H:%M:%S -> page from \"") + url + "\"\n")
    sys.stderr.write(str(page) + "\"\n")
    return True

def get_page(url):
    while 1:
        try:
            page = urllib.request.urlopen(url)
            yield page
        except urllib.error.URLError as e:
            sys.stderr.write(time.strftime("%d/%m/%Y %H:%M:%S -> impossible to access \"") + url + "\"\n")
            time.sleep(5)
            continue

def main():
    print('in main')
    for page in get_page(url):
        make_some_stuff(page, url)
        time.sleep(5)

if __name__ == '__main__':
    main()
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

if the computer is connected to the internet (via an ethernet connection, for example) and I run this script, it works like a charm:
- urllib.request.urlopen returns the page
- make_some_stuff writes to stderr
- when the ethernet cable is unplugged, the except block handles the error for as long as the cable stays unplugged, and when the cable is plugged back in, urllib.request.urlopen returns the page again and make_some_stuff writes to stderr

this is the normal behavior (for me, imho).

but if the computer is not connected to the internet (ethernet cable unplugged) when I run this script, the except block handles the error (normal), but when I plug the cable back in, the script keeps looping and urllib.request.urlopen never returns the page (it always goes to the except block).

What can I do to handle that?

Thanks
Steeve
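For what it's worth, a sketch of a variant of the fetch step that logs the underlying failure reason (`e.reason`) on each attempt; the reason string should show whether the stuck loop is a DNS resolution failure (e.g. "Name or service not known", which can indicate stale resolver state from starting offline) or something else, such as a refused connection. The `fetch_once` helper name and the 10-second timeout are my own additions, not from the original script:

```python
import sys
import time
import urllib.request
import urllib.error

def fetch_once(url, timeout=10):
    """Try one request; return the response, or None after logging why it failed."""
    try:
        # An explicit timeout keeps a half-up network from hanging the loop forever.
        return urllib.request.urlopen(url, timeout=timeout)
    except urllib.error.URLError as e:
        # e.reason distinguishes DNS errors ("Name or service not known")
        # from refused connections, timeouts, etc.
        sys.stderr.write(time.strftime("%d/%m/%Y %H:%M:%S -> ")
                         + "failed: " + str(e.reason) + "\n")
        return None
```

The generator in the script could call this in its `while 1` loop and only `yield` when the result is not None; the logged reason would narrow down where the post-reconnect requests are actually failing.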
-- http://mail.python.org/mailman/listinfo/python-list