I think quite a lot of front-end uses of BITS_PER_UNIT should really be 
TYPE_PRECISION (char_type_node) (which I'd generally consider preferable 
to CHAR_TYPE_SIZE in the front ends).  Though it's pretty poorly defined 
what data structures should look like if the target "char" in the front 
ends is wider than the instruction-set unit of BITS_PER_UNIT bits.
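
To illustrate (a made-up fragment, assuming GCC's internal tree API from 
tree.h; the variable names are mine): both expressions below denote the 
width of the language's "char", but the second reads it off the type node 
the front end itself built rather than off the tm.h target macro:

  /* Width of char from the target macro (tm.h).  */
  unsigned int char_bits_macro = CHAR_TYPE_SIZE;
  /* Width of char as recorded on the front end's own char type node;
     generally preferable inside the front ends.  */
  unsigned int char_bits_tree = TYPE_PRECISION (char_type_node);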

If something relates to an interface to a lower-level part of the compiler 
then BITS_PER_UNIT is probably right - but if something relates to whether 
a type is a variant of char, or to the alignment of a non-bit-field object 
(you can't have alignment smaller than char), or things like that, then 
TYPE_PRECISION (char_type_node) may be better.
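
As a sketch of the distinction (the function names here are invented, and 
this again assumes GCC's internal tree API):

  /* Language-level question: is TYPE an integer type with the width of
     "char"?  Compare against the char type node, not BITS_PER_UNIT.  */
  static bool
  type_has_char_width_p (tree type)
  {
    return (INTEGRAL_TYPE_P (type)
            && TYPE_PRECISION (type) == TYPE_PRECISION (char_type_node));
  }

  /* Lower-level question: how many target storage units does an object
     of BITS bits occupy?  This genuinely wants BITS_PER_UNIT.  */
  static unsigned int
  storage_units_for_bits (unsigned int bits)
  {
    return (bits + BITS_PER_UNIT - 1) / BITS_PER_UNIT;
  }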

Note that BITS_PER_UNIT is used in code built for the target (libgcc2.c, 
dfp-bit.h, fixed-bit.h, fp-bit.h, libobjc/encoding.c, ...), and converting 
it to a hook requires eliminating those uses.  __CHAR_BIT__ is a suitable 
replacement, at least if the code really cares about char - which is the 
case whenever the value is multiplied by the result of "sizeof".  Some 
questions about machine modes might most usefully be answered by 
predefined macros giving properties of particular machine modes.
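
For instance (an invented fragment, not taken from libgcc - the macro name 
is hypothetical), the usual pattern in code built for the target multiplies 
a sizeof result by the number of bits per storage unit; since sizeof counts 
chars, GCC's predefined __CHAR_BIT__ (the same value as CHAR_BIT from 
<limits.h>) is an exact replacement for BITS_PER_UNIT there and removes the 
dependence on tm.h:

  #include <stdio.h>

  /* Hypothetical macro; the point is only the use of __CHAR_BIT__
     instead of BITS_PER_UNIT alongside sizeof.  */
  #define TYPE_BITS(type) (sizeof (type) * __CHAR_BIT__)

  int
  main (void)
  {
    printf ("long is %zu bits wide\n", TYPE_BITS (long));
    return 0;
  }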

-- 
Joseph S. Myers
jos...@codesourcery.com
