Skip Montanaro wrote:
aahz> Here's the stark simple recipe: when you use Unicode, you *MUST*
aahz> switch to a Unicode-centric view of the universe. Therefore you
aahz> encode *FROM* Unicode and you decode *TO* Unicode. Period. It's
aahz> similar to the way floating point contaminates ints.
That's what I do in my code. Why do Unicode objects have a decode method
then?
Because MAL implemented it! >;->
It first encodes the unicode object with the default encoding and then
decodes the result with the specified encoding, so if u is a unicode object
u.decode("utf-16")
is an abbreviation of
u.encode().decode("utf-16")
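Here is a minimal Python 2 sketch of that equivalence (the value of u is a
made-up ASCII-only example, and it assumes the default encoding is ASCII, so
the implicit encode step cannot fail; the decoded result is nonsense
characters, the point is only that both spellings do the same thing):

import sys

u = u"ab"                                # hypothetical value; ASCII-only, and an even
                                         # number of bytes once encoded, as UTF-16 needs
assert sys.getdefaultencoding() == "ascii"   # the assumption this sketch relies on
shortcut = u.decode("utf-16")            # implicit: encode with the default encoding first
explicit = u.encode().decode("utf-16")   # the spelled-out two-step version
assert shortcut == explicit              # both spellings give the same result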
In the same way, str has an encode method, so if s is a str object
s.encode("utf-16")
is an abbreviation of
s.decode().encode("utf-16")
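And the mirror-image sketch for str (again Python 2, with a made-up
ASCII-only value for s, so the implicit decode with the default encoding
succeeds):

s = "hello"                              # hypothetical value; must be decodable with the
                                         # default (usually ASCII) encoding
shortcut = s.encode("utf-16")            # implicit: decode with the default encoding first
explicit = s.decode().encode("utf-16")   # the spelled-out two-step version
assert shortcut == explicit              # both yield the same UTF-16 byte string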
Bye,
Walter Dörwald