On Mon, Jan 18, 2016 at 4:04 PM, Jason Merrill <ja...@redhat.com> wrote:
> On 01/18/2016 02:12 PM, Patrick Palka wrote:
>>
>> On Mon, Jan 18, 2016 at 10:34 AM, Jason Merrill <ja...@redhat.com> wrote:
>>>
>>> On 12/25/2015 12:37 PM, Patrick Palka wrote:
>>>>
>>>> That alone would not be sufficient because more_specialized_fn()
>>>> doesn't call maybe_adjust_types_for_deduction() beforehand, yet we
>>>> have to do the decaying there too (and on both types, not just one of
>>>> them).
>>>>
>>>> And maybe_adjust_types_for_deduction() seems to operate on the
>>>> presumption that one type is the parameter type and one is the
>>>> argument type.  But in more_specialized_fn() and in get_bindings() we
>>>> are really working with two parameter types and have to decay them
>>>> both.  So sometimes we have to decay one of the types that are
>>>> eventually going to get passed to unify(), and other times we want to
>>>> decay both types that are going to get passed to unify().
>>>> maybe_adjust_types_for_deduction() seems to only expect the former
>>>> case.
>>>>
>>>> Finally, maybe_adjust_types_for_deduction() is not called when
>>>> unifying a nested function declarator (because it is guarded by the
>>>> subr flag in unify_one_argument), so doing it there we would also
>>>> regress in the following test case:
>>>
>>> Ah, that makes sense.
>>>
>>> How about keeping the un-decayed type in the PARM_DECLs, so that we get
>>> the substitution failure in instantiate_template, but having the decayed
>>> type in the TYPE_ARG_TYPES, probably by doing the decay in grokparms, so
>>> it's already decayed when we're doing unification?
>>
>> I just tried this, and it works well!  With this approach, all but one
>> of the test cases pass.  The failing test case is unify17.C:
>>
>> -- 8< --
>>
>> void foo (int *);
>>
>> template <typename T>
>> void bar (void (T[5])); // { dg-error "array of 'void'" }
>>
>> void
>> baz (void)
>> {
>>   bar<void> (0); // { dg-error "no matching function" }
>> }
>>
>> -- 8< --
>>
>> Here, we don't get a substitution failure because we don't have a
>> corresponding FUNCTION_DECL for the nested function specifier, only a
>> FUNCTION_TYPE.  So there is no PARM_DECL to recurse into during
>> substitution that retains the un-decayed argument type "T[5]" of the
>> nested function specifier.
>
> Then your original patch is OK.  Thanks for indulging me.
I have committed the original patch, but I just noticed that it has the
unintended effect of changing whether certain template declarations are
considered duplicates or not.  For instance, in the following test case
we no longer consider the three declarations of "foo" to be duplicates;
we instead register three overloads for foo, so that when we go to use
foo later on we now get an ambiguity error:

$ cat hmmmmm.cc
template <typename T>
int foo (T [4]);

template <typename T>
int foo (T [3]);

template <typename T>
int foo (T *);

int x = foo<int> (0);
$ g++ hmmmmm.cc
hmmmmm.cc:10:20: error: call of overloaded ‘foo(int)’ is ambiguous
 int x = foo<int> (0);
                    ^
hmmmmm.cc:2:5: note: candidate: int foo(T [4]) [with T = int]
 int foo (T [4]);
     ^~~
hmmmmm.cc:5:5: note: candidate: int foo(T [3]) [with T = int]
 int foo (T [3]);
     ^~~
hmmmmm.cc:8:5: note: candidate: int foo(T*) [with T = int]
 int foo (T *);
     ^~~

Before the patch, this test case would have compiled cleanly because
each subsequent declaration of foo would have been considered a
duplicate of the first one (since we would have decayed the array
parameter types early on).

I am not sure whether the previous or current behavior is correct, or
if maybe the first two decls ought to be considered duplicates and the
last one not.  What do you think?
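FWIW, for comparison, here is the corresponding non-template test case
(just a sketch of my own, not something from the patch or the testsuite).
Since a non-dependent array parameter type is adjusted to the
corresponding pointer type and the bound is ignored, all three
declarations here declare the same function and the call is unambiguous:

-- 8< --

// All three declare the same function: the array parameter types are
// adjusted to int*, so the bounds 4 and 3 are irrelevant.
int foo (int [4]);
int foo (int [3]);
int foo (int *);

// Defines that one function.
int foo (int *p) { return p == 0; }

int x = foo (0); // unambiguous

-- 8< --

Whether the same reasoning should carry over to declarations whose array
parameter types are dependent is exactly the question above.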