The following addresses a behavioral difference in vector type analysis between typedef and non-typedef scalar types. It doesn't fix the issue at hand but avoids a spurious difference in the vectorizer dumps.
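
For illustration, a loop of the following shape (a hypothetical reduction, not the PR's testcase) ends up with a scalar type that differs from TREE_TYPE (vectype) only by a typedef; a pointer comparison then reports a mismatch where types_compatible_p does not:

/* Hypothetical example: 'elem_t' is a distinct type node from 'short',
   but the two types are compatible.  With the plain pointer comparison
   the dump would note a (spurious) smaller scalar type only for the
   typedef variant, even though vectorization is otherwise identical.  */
typedef short elem_t;

void
scale (elem_t * __restrict dst, const elem_t * __restrict src, int n)
{
  for (int i = 0; i < n; ++i)
    dst[i] = src[i] * 2;
}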
Bootstrapped and tested on x86_64-unknown-linux-gnu, pushed.

	PR tree-optimization/116081
	* tree-vect-stmts.cc (vect_get_vector_types_for_stmt): Properly
	compare types.
---
 gcc/tree-vect-stmts.cc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/gcc/tree-vect-stmts.cc b/gcc/tree-vect-stmts.cc
index a47482375c1..aa98599c1f5 100644
--- a/gcc/tree-vect-stmts.cc
+++ b/gcc/tree-vect-stmts.cc
@@ -14905,7 +14905,7 @@ vect_get_vector_types_for_stmt (vec_info *vinfo, stmt_vec_info stmt_info,
	 vector size per vectorization).  */
       scalar_type = vect_get_smallest_scalar_type (stmt_info,
						    TREE_TYPE (vectype));
-      if (scalar_type != TREE_TYPE (vectype))
+      if (!types_compatible_p (scalar_type, TREE_TYPE (vectype)))
 	{
 	  if (dump_enabled_p ())
 	    dump_printf_loc (MSG_NOTE, vect_location,
-- 
2.43.0