Hi!

This issue is latent on trunk/6.2 and fails with checking enabled on 5.x/4.9.
As usual in fold_*_loc, arg0/arg1 are the STRIP_NOPS'd op0/op1 and so might
have an incompatible type, so we need to convert them to the right type first
before optimizing.
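To make the transformation concrete, here is a minimal standalone sketch (not
part of the patch) of the identity the RROTATE_EXPR case relies on: rotating a
bitwise operation with a constant operand is the same as applying the bitwise
operation to the individually rotated operands, provided both operands are in
the rotate's type first.  rotr16 is a hypothetical helper written just for
this illustration, not a GCC API:

/* Sketch of the identity used when permuting a bitwise operation with a
   rotate: (x & c) r>> n == (x r>> n) & (c r>> n), with everything done in
   the rotate's (16-bit) type.  */
#include <stdio.h>

static unsigned short
rotr16 (unsigned short x, unsigned int n)
{
  n &= 15;
  return (unsigned short) ((x >> n) | (x << ((16 - n) & 15)));
}

int
main (void)
{
  unsigned short x = 0x1234;
  unsigned short c = 0x00ff;
  unsigned int n = 8;
  /* Left-hand side: rotate the bitwise AND.  */
  unsigned short lhs = rotr16 ((unsigned short) (x & c), n);
  /* Right-hand side: AND the two rotations.  */
  unsigned short rhs = (unsigned short) (rotr16 (x, n) & rotr16 (c, n));
  printf ("%s\n", lhs == rhs ? "equal" : "different");
  return 0;
}

In the testcase below the bitwise operand reaches the rotate folding through a
narrowing cast, so after STRIP_NOPS its type differs from the rotate's
unsigned short type; the fold_convert_loc calls keep the rebuilt trees type
consistent.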
Bootstrapped/regtested on x86_64-linux/i686-linux, ok everywhere?

2016-06-29  Jakub Jelinek  <ja...@redhat.com>

        PR middle-end/71693
        * fold-const.c (fold_binary_loc) <case RROTATE_EXPR>: Cast
        TREE_OPERAND (arg0, 0) and TREE_OPERAND (arg0, 1) to type first
        when permuting bitwise operation with rotate.  Cast
        TREE_OPERAND (arg0, 0) to type when cancelling two rotations.

        * gcc.c-torture/compile/pr71693.c: New test.

--- gcc/fold-const.c.jj	2016-06-14 12:17:20.530282919 +0200
+++ gcc/fold-const.c	2016-06-29 16:39:15.175562715 +0200
@@ -10294,11 +10294,15 @@ fold_binary_loc (location_t loc,
               || TREE_CODE (arg0) == BIT_IOR_EXPR
               || TREE_CODE (arg0) == BIT_XOR_EXPR)
           && TREE_CODE (TREE_OPERAND (arg0, 1)) == INTEGER_CST)
-        return fold_build2_loc (loc, TREE_CODE (arg0), type,
-                                fold_build2_loc (loc, code, type,
-                                                 TREE_OPERAND (arg0, 0), arg1),
-                                fold_build2_loc (loc, code, type,
-                                                 TREE_OPERAND (arg0, 1), arg1));
+        {
+          tree arg00 = fold_convert_loc (loc, type, TREE_OPERAND (arg0, 0));
+          tree arg01 = fold_convert_loc (loc, type, TREE_OPERAND (arg0, 1));
+          return fold_build2_loc (loc, TREE_CODE (arg0), type,
+                                  fold_build2_loc (loc, code, type,
+                                                   arg00, arg1),
+                                  fold_build2_loc (loc, code, type,
+                                                   arg01, arg1));
+        }
 
       /* Two consecutive rotates adding up to the some integer
          multiple of the precision of the type can be ignored.  */
@@ -10307,7 +10311,7 @@ fold_binary_loc (location_t loc,
           && TREE_CODE (TREE_OPERAND (arg0, 1)) == INTEGER_CST
           && wi::umod_trunc (wi::add (arg1, TREE_OPERAND (arg0, 1)),
                              prec) == 0)
-        return TREE_OPERAND (arg0, 0);
+        return fold_convert_loc (loc, type, TREE_OPERAND (arg0, 0));
 
       return NULL_TREE;
 
--- gcc/testsuite/gcc.c-torture/compile/pr71693.c.jj	2016-06-29 16:39:15.175562715 +0200
+++ gcc/testsuite/gcc.c-torture/compile/pr71693.c	2016-06-29 16:39:15.175562715 +0200
@@ -0,0 +1,10 @@
+/* PR middle-end/71693 */
+
+unsigned short v;
+
+void
+foo (int x)
+{
+  v = ((((unsigned short) (0x0001 | (x & 0x0070) | 0x0100) & 0x00ffU) << 8)
+       | (((unsigned short) (0x0001 | (x & 0x0070) | 0x0100) >> 8) & 0x00ffU));
+}

        Jakub