On Wed, Jan 29, 2003 at 10:23:26AM -0800, Michael Lazzaro wrote:
> OK, I think we agree that 'default' refers to what to put in the
> 'holes' of an array (or hash, but that's a separate discussion). When
> you overlay a real hash on top of your default values, the default
> values "show through the holes". So now we just have to define what
> "holes" are.
Holes are undefined things.

> An assertion: The 'is default' property overrides the default 'empty'
> value of the given underlying type. For a normal array of scalars,
> that 'empty' value is C<undef>. But some scalar types can be set to
> undef, and some can't:
>
>     my @a;           # 'empty' value of a scalar is undef
>     my int @a_int;   # 'empty' value of an 'int' is 0
>     my str @a_str;   # 'empty' value of a 'str' is ''
>
>     my Int @a_Int;   # 'empty' value of an 'Int' is undef (right?)
>     my Str @a_Str;   # 'empty' value of a 'Str' is undef (right?)
>
> So C<is default <def>> is defining the value to use as the 'empty
> value' of the _underlying cell type_.

I'd say that "undef" is the universal out-of-bounds value that can be
applied to any type or aggregate to show the absence of value
("empty"). It's just that undef autovivifies to different things
depending on how the thing was declared.

> There are two credible choices, AFAICT:
>
> Solution 1: If you attempt to SET a cell to its 'empty value', it
> will be set to its default:
>
>     my int @a is default(5);
>     @a[5] = 0;       # actually sets it to its 'empty value', 5
>     @a[5] = undef;   # autocnv to 0, + warning, still sets to 5
>
>     my Int @a is default(5);   # NOTE difference in type!
>     @a[5] = 0;       # THIS really does set it to 0
>     @a[5] = undef;   # and this sets it to 5
>
> So you can't set something to its type's own empty value, because it
> will, by definition, thereafter return its "overloaded" empty value,
> <def>.

Looks like a maintenance nightmare to me. If you always think of undef
as the empty value, then @a[5] = 0 gives the sixth element in the array
the value 0, and @a[5] = undef gives the sixth element the undefined
value (or the default value, if defaulting applies).

> Solution 2: _ANY_ other solution would require the introduction of
> 'fake empty' and 'really empty', and require arrays to keep track of
> the difference.
>
>     my Int @a is default(5);
>
>     @a[3] = 3;       # there are now 4 items in the array
>     @a[2];           # was autoset to undef, so returns 5
>     @a[4];           # doesn't exist, so returns 5
>
>     @a[2] = undef;   # well, it's still undef, but now mark it
>                      # as a 'real' undef, so don't return 5.

Strange. I was thinking of the default value as what you get when you
don't know what goes there (how do you know when you don't know? Easy:
if it's undef, you don't know).

    my int @a is default(5);   # "int" could be *any* type
    @a[3] = 3;
    print @a[2];   # prints 5, exists but undefined
    @a[3] = undef;
    print @a[3];   # again, prints 5

Why would you want to put a "real undef" in an array that has a
default value? The whole point of defaulting is to change what "undef"
means for the array/hash/whatever.

MHO,

-Scott
--
Jonathan Scott Duff
[EMAIL PROTECTED]
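
[A minimal sketch of the model argued for above, in Python rather than
Perl 6 (the DefaultingArray class and every name in it is illustrative
only, not anything from the Perl 6 design documents). Because a hole
and an undef cell are the same thing, the structure needs no extra flag
to tell a 'real' empty from a 'fake' one.]

    # Toy model: a cell that was never assigned, or that was assigned
    # undef (None here), reads back as the default.
    class DefaultingArray:
        def __init__(self, default=None):
            self.default = default
            self.cells = {}          # index -> value; a missing index is a hole

        def __setitem__(self, i, value):
            if value is None:        # storing "undef" just re-opens the hole
                self.cells.pop(i, None)
            else:
                self.cells[i] = value

        def __getitem__(self, i):
            return self.cells.get(i, self.default)

    a = DefaultingArray(default=5)
    a[3] = 3
    print(a[2])    # 5 -- a hole, so the default shows through
    a[3] = None    # "undef" it again
    print(a[3])    # 5 -- back to the default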