------- Comment #6 from pinskia at gcc dot gnu dot org 2007-03-12 22:38 -------
From native_encode_int, we get:
(gdb) p/x *(unsigned int[2]*)ptr
$14 = {0xffff0f00, 0xffffffff}
which is obviously wrong; it should have been encoded as:
{0x000fffff, 0xffffffff}
So we get the wrong answer to begin with, and it is not an issue with
native_interpret_real, at least at this point.
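For context, here is a minimal, hypothetical C sketch (not GCC code) of what the gdb dump
above is showing: a 64-bit constant written byte-by-byte into a buffer and then viewed as
unsigned int[2]. The constant 0xffffffff000fffff is only an illustrative choice that
reproduces the expected dump {0x000fffff, 0xffffffff} on a little-endian host.

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main (void)
{
  /* Hypothetical constant chosen so the expected dump matches the one above.  */
  uint64_t value = 0xffffffff000fffffULL;
  unsigned char buf[8];

  /* Encode the integer into a byte buffer in host byte order, which is
     roughly the job native_encode_int performs for the target.  */
  memcpy (buf, &value, sizeof buf);

  /* View the buffer the same way the gdb command p/x *(unsigned int[2]*)ptr does.  */
  unsigned int words[2];
  memcpy (words, buf, sizeof words);
  printf ("{0x%08x, 0x%08x}\n", words[0], words[1]);
  /* On a little-endian host this prints: {0x000fffff, 0xffffffff}  */
  return 0;
}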
--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=30704
