https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91003

--- Comment #4 from Richard Biener <rguenth at gcc dot gnu.org> ---
On the RHS we have

 <vector_type 0x7ffff6a4c7e0
    type <boolean_type 0x7ffff6a4c540 public SI
        size <integer_cst 0x7ffff687ade0 constant 32>
        unit-size <integer_cst 0x7ffff687adf8 constant 4>
        align:32 warn_if_not_align:0 symtab:0 alias-set -1 canonical-type
0x7ffff6a4c540 precision:32 min <integer_cst 0x7ffff6a69408 -2147483648> max
<integer_cst 0x7ffff6a694c8 2147483647>>
    V4SI
    size <integer_cst 0x7ffff687abe8 type <integer_type 0x7ffff688f0a8
bitsizetype> constant 128>
    unit-size <integer_cst 0x7ffff687ac00 type <integer_type 0x7ffff688f000
sizetype> constant 16>
    align:128 warn_if_not_align:0 symtab:0 alias-set -1 canonical-type
0x7ffff6a4c7e0 nunits:4>

which is a signed boolean vector type, while the LHS is unsigned int.  The
RHSs were created as invariants.

We run into the case where vect_get_vec_def_for_operand (non-SLP) uses the
passed-in vector type for external/constant defs, but vect_get_slp_defs does
not have that type available and creates the constant/invariant via
vect_get_constant_vectors, which just guesses a vector type based on the
operand type.
