https://gcc.gnu.org/bugzilla/show_bug.cgi?id=114009

--- Comment #3 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
That said, I fail to see why the a/2*2 in there matters.
a*!a is simply always 0 for integral types, both signed and unsigned, including
signed 1-bit precision.  If a is 0, the result is 0*1 (or, for the signed
1-bit case, 0*-1, since true is represented as -1 in that type); if a is
non-zero, the result is a*0.
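
For illustration only (not part of the PR's testcase), a quick C check of that
identity, using a signed 1-bit bit-field to stand in for the 1-bit precision
case:

  /* a * !a is always 0: one of the two factors is zero for every a.  */
  #include <assert.h>

  struct s { signed int b : 1; };  /* signed 1-bit bit-field: values -1 and 0 */

  int
  main (void)
  {
    for (int a = -3; a <= 3; a++)
      assert (a * !a == 0);

    struct s x = { 0 };
    assert (x.b * !x.b == 0);   /* 0 * 1 after promotion to int */
    x.b = -1;
    assert (x.b * !x.b == 0);   /* -1 * 0 */
    return 0;
  }

(In C source the bit-field promotes to int before the multiplication, so the
0*-1 form only shows up at the GIMPLE level where the operation is done in the
1-bit type itself.)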
