> This has nothing to do with the amount of available RAM. The limitation is
> with the virtual memory system.
> If you can have 2^64 bytes of address space, you can store at most "2^64 /
> sizeof(void *)" pointers.
You misunderstood; I wasn't talking about RAM. It was merely an illustration of how an assumption that seemed justified based on the technical limits of the time turned out, in hindsight, to be ridiculous and crippling once those limits were pushed further. Never mind.

> If you have a sparse array, then just don't write it using NSNotFound. We are
> talking about a well-defined class and well-defined API, not a potential,
> nonexistent API that might use the wrong constant.

That's exactly the pitfall that I fell into, and that's the point of my original post. I thought it was safe to use NSNotFound with my classes because it is so commonly used with NSUInteger array indexes in the framework, but it turns out to be unsafe, and without any documented cautions.

>> One higher bit is actually twice as many elements. Why have
>> NSUInteger at all if you can't use more than NSIntegerMax? This
>> doesn't seem right.
>
> Because you want an unsigned number to define the number of elements
> (having a negative number of elements does not make sense), and it is
> pointless and inefficient to define and use a 56-bit integer type on
> actual architectures.

You seem to be ardently defending the idea that it's somehow okay to lose half of the possible array indexes just because. I admit there may be good reasons for this, but I have yet to see what they are.

_______________________________________________
Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)