On Aug 12, 2008, at 9:13 AM, Deborah Goldsmith wrote:
"The Absolute Minimum Every Software Developer Absolutely, Positively Must Know About Unicode and Character Sets (No Excuses!)": http://www.joelonsoftware.com/articles/Unicode.html

That article is missing several concepts that are essential for understanding Unicode. Like many programmers, Mr. Spolsky thinks of Unicode as "wide ASCII", which it is not.
From the article:
Some people are under the misconception that Unicode is simply a 16-bit code where each character takes 16 bits and therefore there are 65,536 possible characters. This is not, actually, correct. It is the single most common myth about Unicode, so if you thought that, don't feel bad.
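A minimal sketch (in Python, chosen here only for brevity) of why the "16-bit" belief is a myth: code points above U+FFFF exist, and in UTF-16 they require two 16-bit units (a surrogate pair), not one.

```python
# U+1D11E MUSICAL SYMBOL G CLEF lies outside the Basic Multilingual Plane.
s = "\U0001D11E"

# One code point...
print(len(s))                      # 1

# ...but four bytes (two 16-bit code units) when encoded as UTF-16.
print(len(s.encode("utf-16-be")))  # 4
```

Any code that assumes one character fits in one 16-bit unit will miscount or corrupt text like this.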
It's worth mentioning that his article isn't meant to convey a complete understanding of Unicode; rather, it really is the absolute minimum every developer must know. Given your own earlier distinction between someone writing code that treats strings as a unit and someone writing code that looks through a string, I would argue that Spolsky's article is, indeed, a very appropriate primer for the *former* class of developer and certainly a more engaging read than many other treatises on character representations. I would also agree with your point that most people looking through strings are doing so without so much as a cursory understanding of some very important concepts.
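To make the "looking through a string" hazard concrete, here is a small illustrative example (again Python, purely for brevity): the same user-perceived character can be represented by different code point sequences, so naive per-code-point comparison or iteration gives surprising answers unless the text is normalized first.

```python
import unicodedata

# "é" two ways: precomposed U+00E9, and "e" followed by
# U+0301 COMBINING ACUTE ACCENT.
a = "\u00e9"
b = "e\u0301"

print(a == b)          # False: different code point sequences
print(len(a), len(b))  # 1 2  -- different lengths for the "same" text

# Normalizing both sides (NFC here) makes comparison behave as users expect.
nfc = unicodedata.normalize
print(nfc("NFC", a) == nfc("NFC", b))  # True
```

This is exactly the kind of detail a developer treating strings as opaque units can safely ignore, and one a developer iterating through strings cannot.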
-> jp
_______________________________________________
Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)