-------- Original Message --------
Θέμα: Re: UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in position 0: invalid start byte
Date: Thu, 04 Jul 2013 14:34:42 +0100
From: MRAB <pyt...@mrabarnett.plus.com>
Reply-To: python-list@python.org
To: python-list@python.org
Newsgroups: comp.lang.python
References: <kr3c7k$tjs$2...@news.grnet.gr> <n3tfaa-eqh....@satorlaser.homedns.org> <kr3jai$jn0$4...@news.grnet.gr> <mailman.4219.1372935984.3114.python-l...@python.org> <kr3mnq$jn0$6...@news.grnet.gr> <mailman.4222.1372939645.3114.python-l...@python.org> <kr3r7b$9h4$2...@news.grnet.gr>

On 04/07/2013 13:52, Νίκος wrote:
> On 4/7/2013 3:07 pm, MRAB wrote:
>> Also, try printing out ascii(os.environ['REMOTE_ADDR']).
>
> '108.162.229.97' is the result of:
>
> print( ascii(os.environ['REMOTE_ADDR']) )
>
> Seems perfectly valid, and it also has a PTR record, so that leaves us
> clueless about the internal server error.

For me, socket.gethostbyaddr('108.162.229.97') raises socket.herror,
which is also a subclass of OSError from Python 3.3 onwards.
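
A quick way to check that subclass relationship on your own interpreter
(a minimal sketch; it relies on PEP 3151, which rebased the socket
exceptions onto OSError in 3.3):

import socket

# On Python 3.3+ this prints True; on earlier versions socket.herror
# descends from socket.error/IOError, which is separate from OSError.
print(issubclass(socket.herror, OSError))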

> Tell me how I should write the try/except, please.
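
A minimal sketch of how that try/except might look, assuming Python 3.3+
and that falling back to the bare IP is an acceptable default when no
PTR record resolves (the fallback is just an illustration, not the only
reasonable choice):

import os
import socket

# REMOTE_ADDR is set by the web server for CGI scripts; the default
# here is only so the sketch runs outside a CGI environment.
ip = os.environ.get('REMOTE_ADDR', '108.162.229.97')

try:
    # gethostbyaddr() returns (hostname, aliaslist, ipaddrlist) and
    # raises socket.herror when no reverse DNS entry can be found.
    host = socket.gethostbyaddr(ip)[0]
except socket.herror:
    # No PTR record resolvable from this machine; keep the IP itself.
    host = ip

print(host)

Catching OSError instead of socket.herror would also cover gaierror and
timeouts on 3.3+, at the cost of masking unrelated OS-level failures.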
--
What is now proved was at first only imagined!


--
http://mail.python.org/mailman/listinfo/python-list
