In my perfect programming world ...

I'd want every character, everywhere characters are displayed or entered, to be 
handled as Unicode characters and represented as UTF-8 bytes.

If the displayed text contains "割劥" I'd want the language to recognize that as 
two characters and as 6 bytes.
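A quick sketch of that distinction, borrowing Python rather than LiveCode just 
for illustration:

    s = "割劥"
    print(len(s))                  # 2 -- two Unicode characters (code points)
    print(len(s.encode("utf-8")))  # 6 -- each of these CJK characters is 3 bytes in UTF-8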

I want UTF-8 instead of UTF-16 because UTF-8 produces the same byte stream 
regardless of processor endianness and, more importantly, the entire web uses UTF-8.
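
To illustrate (again a Python sketch, not LiveCode): the same character comes 
out byte-swapped in the two UTF-16 flavors but identical everywhere in UTF-8:

    s = "割"  # U+5272
    print(s.encode("utf-16-le").hex())  # 7252   -- little-endian machines
    print(s.encode("utf-16-be").hex())  # 5272   -- big-endian machines
    print(s.encode("utf-8").hex())      # e589b2 -- same bytes on any machine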

Is this crazy talk, or would this be your ideal programming system for Unicode?

Kee Nethery