Hi,

On Mon, 16 May 2011, Richard Guenther wrote:

> > I think conversion _to_ BOOLEAN_TYPE shouldn't be useless, on the
> > grounds that it requires booleanization (at least conceptually), i.e.
> > conversion to a set of two values (no matter the precision or size)
> > based on the outcome of comparing the RHS value with
> > false_pre_image(TREE_TYPE(RHS)).
> >
> > Conversion _from_ BOOLEAN_TYPE can be regarded as useless, as the
> > conversion from false or true into false_pre_image or true_pre_image
> > is always simply an embedding of 0 or 1/-1 (depending on target type
> > signedness).  And if the BOOLEAN_TYPE and the LHS have the same
> > signedness, the bit representation of boolean_true_type is (or should
> > be) the same as the one converted to LHS (namely either 1 or -1).
>
> Sure, that would probably be enough to prevent non-BOOLEAN_TYPEs from
> being used where BOOLEAN_TYPE nodes were used before.  It still will
> cause an artificial conversion from a single-bit bitfield read to a
> bool.

Not if you're special-casing single-bit conversions (on the grounds that
a booleanization from one two-valued set to a different two-valued set
of the same signedness will not actually require a comparison).

I think it's better to be very precise in our base predicates than to
add various hacks all over the place to care for imprecision.


Ciao,
Michael.
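
P.S.: To make the special-casing concrete, here is a rough sketch (not
actual GCC source; the helper name and its exact placement inside
useless_type_conversion_p are assumptions) of how the boolean cases
could be handled:

/* Hypothetical helper: decide whether a conversion between
   INNER_TYPE (RHS) and OUTER_TYPE (LHS) involving BOOLEAN_TYPE
   can be considered useless.  Sketch only, not actual GCC code.  */
static bool
boolean_conversion_useless_p (tree outer_type, tree inner_type)
{
  /* Conversions _from_ BOOLEAN_TYPE: false/true embed as 0 and 1/-1
     in any integral type of the same signedness, so no work is done.  */
  if (TREE_CODE (inner_type) == BOOLEAN_TYPE
      && TYPE_UNSIGNED (inner_type) == TYPE_UNSIGNED (outer_type))
    return true;

  /* Conversions _to_ BOOLEAN_TYPE require booleanization and are
     not useless in general ...  */
  if (TREE_CODE (outer_type) == BOOLEAN_TYPE)
    {
      /* ... except when the source is itself two-valued and of the
         same signedness (e.g. a single-bit bitfield read), where no
         comparison against false_pre_image is actually needed.  */
      if (TYPE_PRECISION (inner_type) == 1
          && TYPE_UNSIGNED (inner_type) == TYPE_UNSIGNED (outer_type))
        return true;
      return false;
    }

  return false;
}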
> > I think conversion _to_ BOOLEAN_TYPE shouldn't be useless, on the > > grounds that it requires booleanization (at least conceptually), i.e. > > conversion to a set of two values (no matter the precision or size) > > based on the outcome of comparing the RHS value with > > false_pre_image(TREE_TYPE(RHS)). > > > > Conversion _from_ BOOLEAN_TYPE can be regarded as useless, as the > > conversions from false or true into false_pre_image or true_pre_image > > always is simply an embedding of 0 or 1/-1 (depending on target type > > signedness). And if the BOOLEAN_TYPE and the LHS have same signedness > > the bit representation of boolean_true_type is (or should be) the same > > as the one converted to LHS (namely either 1 or -1). > > Sure, that would probably be enough to prevent non-BOOLEAN_TYPEs be used > where BOOLEAN_TYPE nodes were used before. It still will cause an > artificial conversion from a single-bit bitfield read to a bool. Not if you're special casing single-bit conversions (on the grounds that a booleanization from two-valued set to a different two-valued set of the same signedness will not actually require a comparison). I think it's better to be very precise in our base predicates than to add various hacks over the place to care for imprecision. Ciao, Michael.