On Thu, Jan 25, 2018 at 3:45 PM, Jakub Jelinek <ja...@redhat.com> wrote:
> On Fri, Jan 05, 2018 at 09:52:36AM +0100, Richard Biener wrote:
>> On Wed, Jan 3, 2018 at 5:31 PM, Marek Polacek <pola...@redhat.com> wrote:
>> > Here we are crashing because cxx_fold_indirect_ref got a POINTER_PLUS_EXPR
>> > with offset > signed HOST_WIDE_INT and we tried to convert it to sHWI.
>> >
>> > The matching code in fold_indirect_ref_1 uses uHWIs so I've followed suit.
>> > But that code now also uses poly_uint64 and I'm not sure if any of the
>> > constexpr.c code should use it, too.  But this patch fixes the ICE.
>>
>> POINTER_PLUS_EXPR offsets are to be interpreted as signed (ptrdiff_t),
>> so using uhwi and then performing an unsigned division is wrong code.
>> See mem_ref_offset for how to deal with this (ugh - poly-ints...).  Basically
>> you have to force the thing to signed.  Like just use
>>
>>   HOST_WIDE_INT offset = TREE_INT_CST_LOW (op01);
>
> Does it really matter here though?  Any negative offsets there are UB, we
> should punt on them rather than try to optimize them.
> As we know op01 is unsigned, if we check that it fits into shwi_p, it means
> it will be 0 to shwi max and then we can handle it in uhwi too.
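
A standalone sketch of that value-range argument, with HOST_WIDE_INT modeled
as int64_t (an assumption for illustration only; this is not the constexpr.c
code, and fits_shwi_p below is a stand-in, not the real tree_fits_shwi_p):
once an unsigned constant passes the "fits in signed HWI" check it lies in
[0, INT64_MAX], so reading it with unsigned arithmetic and dividing unsigned
gives the same element index the signed interpretation would.

/* Illustration only: models the shwi-check / uhwi-read pattern with
   plain integer types.  */
#include <stdint.h>
#include <stdio.h>

/* Rough stand-in for tree_fits_shwi_p applied to an unsigned constant.  */
static int
fits_shwi_p (uint64_t x)
{
  return x <= (uint64_t) INT64_MAX;
}

int
main (void)
{
  uint64_t offset = 48;        /* byte offset from a POINTER_PLUS_EXPR */
  uint64_t part_width = 16;    /* element size in bytes */

  if (!fits_shwi_p (offset))
    return 1;                  /* punt: would be a negative ptrdiff_t */

  /* offset is in [0, INT64_MAX], so unsigned and signed division agree.  */
  uint64_t idx_u = offset / part_width;
  int64_t idx_s = (int64_t) offset / (int64_t) part_width;
  printf ("%llu == %lld\n", (unsigned long long) idx_u, (long long) idx_s);
  return 0;
}
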
Ah, of course.  Didn't look up enough context to see what this code does
in the end ;)

>       /* ((foo*)&vectorfoo)[1] => BIT_FIELD_REF<vectorfoo,...> */
>       if (VECTOR_TYPE_P (op00type)
>           && (same_type_ignoring_top_level_qualifiers_p
> -             (type, TREE_TYPE (op00type))))
> +             (type, TREE_TYPE (op00type)))
> +         && tree_fits_shwi_p (op01))

Nevertheless this apparent "mismatch" deserves a comment (checking shwi
but using uhwi).

>         {
> -         HOST_WIDE_INT offset = tree_to_shwi (op01);
> +         unsigned HOST_WIDE_INT offset = tree_to_uhwi (op01);
>           tree part_width = TYPE_SIZE (type);
> -         unsigned HOST_WIDE_INT part_widthi = tree_to_shwi (part_width)/BITS_PER_UNIT;
> +         unsigned HOST_WIDE_INT part_widthi
> +           = tree_to_uhwi (part_width) / BITS_PER_UNIT;
>           unsigned HOST_WIDE_INT indexi = offset * BITS_PER_UNIT;
>           tree index = bitsize_int (indexi);
>
>           if (known_lt (offset / part_widthi,
>                         TYPE_VECTOR_SUBPARTS (op00type)))
>             return fold_build3_loc (loc,
>                                     BIT_FIELD_REF, type, op00,
>                                     part_width, index);
>
>         }
>
>       Jakub
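
For reference, a minimal sketch of the kind of source construct the quoted
"((foo*)&vectorfoo)[1]" comment describes (the names v4f and third_lane are
made up for illustration; this is not the testcase from the PR):

/* Hypothetical example: indexing a vector_size object through a pointer
   to its element type.  */
typedef float v4f __attribute__ ((vector_size (16)));

float
third_lane (void)
{
  v4f v = { 1.0f, 2.0f, 3.0f, 4.0f };
  /* Like the "((foo*)&vectorfoo)[1]" case: the dereference of the
     POINTER_PLUS_EXPR "&v + 8" can be folded into a BIT_FIELD_REF
     selecting 32 bits at bit offset 64 of v, i.e. lane 2.  */
  return ((float *) &v)[2];
}

This is the shape of access the quoted hunk handles once the constant byte
offset has passed the tree_fits_shwi_p check.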