In article <[EMAIL PROTECTED]>,
 <[EMAIL PROTECTED]> wrote:
>
>BTW what's the difference between .encode and .decode?
>(yes, I have been living in happy ASCII-land until now ... ;)

Here's the stark simple recipe: when you use Unicode, you *MUST* switch
to a Unicode-centric view of the universe.  Therefore you encode *FROM*
Unicode and you decode *TO* Unicode.  Period.  It's similar to the way
floating point contaminates ints: mix a byte string with a Unicode
string and the result comes out Unicode, just as mixing an int with a
float yields a float.
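
Concretely, here's a minimal sketch of that recipe.  It uses Python 3
spelling (bytes vs. str) and utf-8 purely as an example codec; in
Python 2 the same dance happens between str and unicode:

    # Sketch only: assumes Python 3 and the utf-8 codec.
    text = u"na\xefve"               # Unicode text (str in Python 3)
    data = text.encode("utf-8")      # encode *FROM* Unicode, producing bytes
    print(repr(data))                # b'na\xc3\xafve'
    back = data.decode("utf-8")      # decode the bytes *TO* Unicode
    assert back == text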
-- 
Aahz ([EMAIL PROTECTED])           <*>         http://www.pythoncraft.com/

"19. A language that doesn't affect the way you think about programming,
is not worth knowing."  --Alan Perlis