https://gcc.gnu.org/bugzilla/show_bug.cgi?id=112733
--- Comment #6 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
One thing is obvious: we shouldn't crash on it, and I will debug that.
But what multiple_of_p (or its callers) does here is weird:

14552         /* Check for special cases to see if top is defined as multiple
14553            of bottom:
14554
14555              top = (X & ~(bottom - 1) ; bottom is power of 2
14556
14557            or
14558
14559              Y = X % bottom
14560              top = X - Y.  */
14561         if (code == BIT_AND_EXPR
14562             && (op2 = gimple_assign_rhs2 (stmt)) != NULL_TREE
14563             && TREE_CODE (op2) == INTEGER_CST
14564             && integer_pow2p (bottom)
14565             && wi::multiple_of_p (wi::to_widest (op2),
14566                                   wi::to_widest (bottom), UNSIGNED))
14567           return true;

It uses UNSIGNED, but both op2 and bottom are signed:

(gdb) p debug_tree (op2)
 <integer_cst 0x7fffea2e5e70 type <integer_type 0x7fffea14c2a0 signed char> constant 1>
$17 = void
(gdb) p debug_tree (bottom)
 <integer_cst 0x7fffea14e000 type <integer_type 0x7fffea14c2a0 signed char> constant -128>

so what ends up being tested is
  multiple_of_p (1, 0xffffff................fffffffffffffffffffffffff80)
where the latter has 131072 bits of precision.  That definitely doesn't have
anything to do with what the source does (and it didn't even when the precision
was just 576 bits before).
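To make the oddity concrete, here is a minimal standalone sketch of what the
quoted check ends up computing.  This is not GCC code: to_widest_like and
multiple_of_p_unsigned are hypothetical stand-ins for wi::to_widest and
wi::multiple_of_p, and a plain 64-bit integer stands in for the 131072-bit
widest_int.  The only point is that sign-extending the signed char -128 to the
widest precision and then treating the bits as UNSIGNED yields an enormous
value rather than 128.

  // Hypothetical demonstration, not GCC code.
  #include <cstdint>
  #include <cstdio>

  // Stand-in for wi::to_widest: the INTEGER_CST is extended to the widest
  // precision according to its own (signed) type, i.e. sign-extended.
  static uint64_t
  to_widest_like (int8_t cst)
  {
    return (uint64_t) (int64_t) cst;   // sign-extend, then view the bits unsigned
  }

  // Stand-in for wi::multiple_of_p (a, b, UNSIGNED) on the widened values.
  static bool
  multiple_of_p_unsigned (uint64_t a, uint64_t b)
  {
    return b != 0 && a % b == 0;
  }

  int
  main ()
  {
    int8_t op2 = 1;        // the BIT_AND_EXPR constant from the testcase
    int8_t bottom = -128;  // the signed char "bottom" value

    uint64_t w_op2 = to_widest_like (op2);        // 0x0000000000000001
    uint64_t w_bottom = to_widest_like (bottom);  // 0xffffffffffffff80, not 128

    printf ("op2    = %#018llx\n", (unsigned long long) w_op2);
    printf ("bottom = %#018llx\n", (unsigned long long) w_bottom);
    printf ("multiple_of_p (op2, bottom, UNSIGNED) -> %d\n",
            multiple_of_p_unsigned (w_op2, w_bottom));
    return 0;
  }

In this sketch the check simply comes out false, but the operand it is testing
against is a huge sign-extended bit pattern that bears no relation to the
power-of-two alignment the surrounding comment describes.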