https://gcc.gnu.org/bugzilla/show_bug.cgi?id=114009
Andrew Pinski <pinskia at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|UNCONFIRMED                 |NEW
   Last reconfirmed|                            |2024-02-21
     Ever confirmed|0                           |1

--- Comment #1 from Andrew Pinski <pinskia at gcc dot gnu.org> ---
Confirmed.

What is happening is: `(a/2)*2 == 0` gets optimized to `((unsigned)a) + 1 <= 2`, and then we don't handle `((unsigned)a) + 1 <= 2 ? (a/2) : 0`, which is just 0: on the true side `a` has the range [-1,1], and dividing that by 2 always yields 0.
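
Roughly the kind of testcase involved (a minimal sketch reconstructed from the expressions quoted in the comment; the function name `f` is an assumption, not the PR's actual reproducer):

/* Hypothetical reproducer sketch.  (a/2)*2 == 0 holds exactly when
   a is in [-1, 1].  Since C integer division truncates toward zero,
   -1/2, 0/2, and 1/2 all evaluate to 0, so the whole function should
   fold to a constant 0.  */
int f (int a)
{
  return (a / 2) * 2 == 0 ? a / 2 : 0;
}

Once the comparison has been rewritten to the unsigned form `((unsigned)a) + 1 <= 2`, the fact that `a` is restricted to [-1,1] on the true side is what would let `a/2` fold to 0; per the comment, that case is currently not handled.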