On Wed, 26 Apr 2006, Robert Dewar wrote:
> Bernd Trog wrote:
> > can someone please explain the huge change in the internal
> > integer representation (Uint) from -32769 to -32767?
>
> just a matter of efficiency for commonly used values
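For what it's worth, the general scheme behind "efficiency for commonly used values" can be sketched like this. This is only a toy model, not GNAT's actual uintp implementation; the bounds (-32767 .. 32767), names, and tuple encoding are all assumptions for illustration:

```python
# Toy sketch of a small-integer optimization: values inside an assumed
# "direct" range are encoded in the handle itself; anything outside is
# boxed in a side table and referenced by index. The bounds below are
# hypothetical, not GNAT's real constants.

MIN_DIRECT = -32767
MAX_DIRECT = 32767

_table = []  # storage for non-direct ("boxed") values

def to_uint(n):
    """Return a handle: the value itself if direct, else a table index."""
    if MIN_DIRECT <= n <= MAX_DIRECT:
        return ('direct', n)          # no allocation, cheap to compare
    _table.append(n)                  # out of range: box it in the table
    return ('boxed', len(_table) - 1)

def from_uint(handle):
    """Recover the integer value from a handle."""
    kind, payload = handle
    return payload if kind == 'direct' else _table[payload]

# Under these assumed bounds, -32767 stays direct while -32769 is boxed:
print(to_uint(-32767))   # ('direct', -32767)
print(to_uint(-32769))   # ('boxed', 0)
```

Under a scheme like this, crossing the direct-range boundary switches the representation entirely, which is one way a bug could show up only for values near the boundary.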
Does this mean that there are three different representation "ways" in this case? Is the handling of the value -32768 optimized in some way, while -32769 and -32767 are not optimized in the same way?

> For interest, why do you ask?

I'm chasing a bug that only appears when Standard.Integer'Size is 16:
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26849