On 08.12.2007 at 02:49, Joseph S. Myers wrote:
On Fri, 7 Dec 2007, Ross Ridge wrote:
Boris Boesler writes:
Ok, so what do I have to do to write a back-end where all addresses
are given in bits? Memory is addressed in bits, not bytes. So I set:
#define BITS_PER_UNIT 1
#define UNITS_PER_WORD 32
I don't know if it's useful to define the size of a byte to be less
than 8 bits, even if that more accurately reflects the hardware.
Standard C requires that the char type both be at least 8 bits
(UCHAR_MAX >= 255) and the same size as a byte (sizeof(char) == 1).
You can't define any types that are smaller than a char and have
sizeof work correctly.
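
To make that constraint concrete, here is a tiny standard-C check
(nothing target-specific is assumed):

#include <limits.h>
#include <stdio.h>

int main(void)
{
  /* CHAR_BIT is at least 8 and sizeof(char) is 1 by definition,
     so no C object can be smaller than 8 bits. */
  printf("CHAR_BIT     = %d\n", CHAR_BIT);             /* >= 8 */
  printf("UCHAR_MAX    = %u\n", (unsigned) UCHAR_MAX); /* >= 255 */
  printf("sizeof(char) = %zu\n", sizeof(char));        /* always 1 */
  return 0;
}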
I don't want to change type sizes. It's the addressing!
In theory GCC supports CHAR_TYPE_SIZE > BITS_PER_UNIT, so
sizeof(char) is still 1 (sizeof counts in units of CHAR_TYPE_SIZE,
not BITS_PER_UNIT), but a char is not the hardware addressing unit.
I expect this is even more broken in practice than BITS_PER_UNIT > 8.
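
For a bit-addressed port along those lines, the target macros would
presumably be set up like this (a sketch only; the macro names are
real GCC target macros, but the combination is exactly the untested
case described above):

/* Hypothetical bit-addressed target: the smallest addressable
   unit is one bit, but C's char is still 8 such units wide. */
#define BITS_PER_UNIT   1  /* addressing unit = 1 bit */
#define CHAR_TYPE_SIZE  8  /* char is 8 bits; sizeof counts in
                              these 8-bit chars, not in bits */
#define UNITS_PER_WORD 32  /* a word is 32 one-bit units */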
Hm, ok. So I patched some source code and one generated file, and it
seems to work for int(eger) operations.
But if I want to add chars, GCC runs into an endless loop during
conversion (in the functions convert and convert_to_integer). In
convert.c, around line 526, the parameters are: inprec: 32,
outprec: 1, mode bitsize: 8. I'm wondering about the output
precision of 1; tree.def documents that a type's precision is given
in bits.
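
The trigger is presumably something as small as this (a
reconstructed testcase; the original code was not quoted):

/* Adding two chars promotes them to int; assigning the 32-bit
   result back to a char calls convert_to_integer with inprec 32
   and outprec = TYPE_PRECISION of char, which here comes out as 1
   instead of the expected 8. That suggests char's precision was
   derived from BITS_PER_UNIT rather than from CHAR_TYPE_SIZE. */
char a = 1, b = 2;
char c = a + b;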
Any idea?
Boris