On Jul 23, 4:06 am, Dave Methvin <[EMAIL PROTECTED]> wrote:
> All of those results make sense.
>
> > "print( (new String('foo') === new String('foo')))"
> > false
>
> Those are two different objects, even though they have the same value;
> object1 !== object2 by definition of the === operator.

Agreed.
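For example, something like this (off the top of my head, untested, in any
JS shell which has a print() function) shows the object-vs-primitive
difference:

  var a = new String('foo');   // a String *object*
  var b = new String('foo');   // another, distinct String object
  var c = 'foo';               // a string primitive

  print( a === b );   // false: two different objects
  print( a == b );    // false: == does not unwrap two objects either
  print( a == c );    // true:  the object is converted to its primitive value
  print( a === c );   // false: different types ('object' vs. 'string')
  print( typeof a );  // 'object'
  print( typeof c );  // 'string'

(That last pair is also, i believe, why a new String() breaks switch: the
case comparison is done with ===, so the object never matches a plain
string label.)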
> In real code, I have never seen a good reason to use 'new String()'
> rather than a primitive string. There are several situations where
> 'new String()' will break code that seems like it should work
> perfectly well--for example, in an eval() or switch statement.

Agreed as well, but new String() does turn up unexpectedly from time to
time. Try this code with jQuery 1.1.3.1:

  alert( $(mySelector).css( 'background-color', null ) )

That misbehaves and acts like a getter, returning the color wrapped in a
(new String()). (i had to spend some time debugging that problem in an
application.)

> > "print('foo' === 'foo')"
> > true
>
> Two primitives with the same type and value are === as well as ==

i probably would expect it for string constants, but i would not expect
it to be true for computed strings:

  [EMAIL PROTECTED]:~$ SpiderApe -e "a=function(){return 'abc'[0];}; print(a() === 'a')"
  true

> > "print(typeof someUndefined == 'fuzzyDuck')"
> > false
>
> typeof someUndefined is 'undefined', and 'undefined' != 'fuzzyDuck'

This makes sense, of course, but seems to contradict the standard (as
Rob described it above).

> > "print(typeof someUndefined == 'undefined')"
> > true
>
> typeof someUndefined is 'undefined' (string) so the two are equal.

This is a case where i would not have expected === to work, because
typeof computes a value at runtime whereas 'undefined' is a constant
string literal that the JS compiler can intern/pool (and the 'undefined'
returned from typeof is a native C string somewhere in the engine).

> > "print(typeof someUndefined == undefined)"
> > false
>
> The string 'undefined' is never equal to the intrinsic undefined
> value.

i find it unfortunate that typeof returns the string 'undefined' for the
undefined case, similarly to how the sqlite3 C library's sqlite3_errmsg()
function returns the last error string from the library and evaluates to
the string "not an error" when there is no error (instead of using a null
string, as would have been more sensible). Of course, no matter how
unfortunate i find it, it's written in stone in the ECMA standard, so
bitching won't do much about it. :)
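On the up side, the fact that typeof always hands back a string is also
what makes it the safe way to probe for things which might not exist at
all. A rough sketch (names made up, from memory, same print()-equipped
shell as above):

  var obj = {};

  print( typeof obj.noSuchProp === 'undefined' ); // true: typeof always yields a string
  print( obj.noSuchProp === undefined );          // also true, but relies on the global
                                                  // 'undefined' not having been reassigned

  print( typeof noSuchGlobal === 'undefined' );   // true, and does not throw
  // print( noSuchGlobal === undefined );         // would throw a ReferenceError, since
                                                  // the identifier was never declared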