Hi, I've tried urllib, requests, and some other options, but I haven't found a way to trap certain URLs that can't be connected to from outside the office. In those cases, I just need to output an error.
So, the following code covers the 404s and similar errors for most of the problem IPs and URLs. But for these particular problem URLs, while debugging, it keeps transferring control to adapters.py and outputting:

    requests.exceptions.ConnectionError: HTTPSConnectionPool(host='xxxxxxx.state.gov', port=443): Max retries exceeded with url: /xxx/xxx/xxx....(Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x03934538>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed'))

How do I trap this error without invoking adapters.py? I've tried many things, but the most recent is this code, from the web:

    import requests

    try:
        r = requests.get(url, timeout=3)
        r.raise_for_status()
    except requests.exceptions.HTTPError as errh:
        print("Http Error:", errh)
    except requests.exceptions.ConnectionError as errc:
        print("Error Connecting:", errc)
    except requests.exceptions.Timeout as errt:
        print("Timeout Error:", errt)
    except requests.exceptions.RequestException as err:
        print("Oops: Something Else", err)

It seems like the error should be trapped by the requests.exceptions.ConnectionError clause above, but instead it's hitting adapters.py.
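For reference, here is a minimal, runnable sketch of what I'm attempting, using a made-up unresolvable hostname (nonexistent.example.invalid) and a hypothetical fetch() wrapper to reproduce the getaddrinfo failure outside the office network:

    import requests

    def fetch(url):
        """Fetch url, printing a short message instead of an unhandled traceback."""
        try:
            r = requests.get(url, timeout=3)
            r.raise_for_status()
            return r
        except requests.exceptions.ConnectionError as errc:
            # DNS failures such as "[Errno 11001] getaddrinfo failed" are
            # wrapped in ConnectionError, so this clause should catch them
            # even though the exception is raised inside requests' adapters.py.
            print("Error Connecting:", errc)
        except requests.exceptions.RequestException as err:
            # Base class of HTTPError, Timeout, etc.; catches anything else.
            print("Request failed:", err)
        return None

    # Placeholder host that cannot resolve; expected to print "Error Connecting: ..."
    fetch("https://nonexistent.example.invalid/")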