On Thu, 2 Sep 2021, Jason Merrill wrote:

> On 8/31/21 3:15 PM, Patrick Palka wrote:
> > On Mon, 30 Aug 2021, Patrick Palka wrote:
> > 
> > > In the context of overload resolution we have the notion of a "bad"
> > > argument conversion, which is a conversion that "would be permitted
> > > with a bending of the language standards", and we handle such bad
> > > conversions specially.  In particular, we rank a bad conversion as
> > > better than no conversion but worse than a good conversion, and a bad
> > > conversion doesn't necessarily make a candidate unviable.  With the
> > > flag -fpermissive, we permit the situation where overload resolution
> > > selects a candidate that contains a bad conversion (which we call a
> > > non-strictly viable candidate).  And without the flag we issue a
> > > distinct permerror in this situation instead.
> > > 
> > > One consequence of this de facto behavior is that in order to distinguish
> > > a non-strictly viable candidate from an unviable candidate, if we
> > > encounter a bad argument conversion during overload resolution we must
> > > keep converting subsequent arguments because a subsequent conversion
> > > could render the candidate unviable instead of just non-strictly viable.
> > > But checking subsequent arguments can force template instantiations and
> > > result in otherwise avoidable hard errors.  And in particular, all
> > > 'this' conversions are at worst bad, so this means the
> > > const/ref-qualifiers
> > > of a member function can't be used to prune a candidate quickly, which
> > > is the subject of the mentioned PR.
> > > 
> > > This patch tries to improve the situation without changing the de facto
> > > output of add_candidates.  Specifically, when considering a candidate
> > > during overload resolution this patch makes us shortcut argument
> > > conversion checking upon encountering the first bad conversion
> > > (tentatively marking the candidate as non-strictly viable, though it
> > > could ultimately be unviable) under the assumption that we'll eventually
> > > find a strictly viable candidate anyway (rendering the distinction
> > > between non-strictly viable and unviable moot, since both are worse
> > > than a strictly viable candidate).  If this assumption turns out to be
> > > false, we'll fully reconsider the candidate under the de facto behavior
> > > (without the shortcutting).
> > > 
> > > So in the best case (there's a strictly viable candidate), we avoid
> > > some argument conversions and/or template argument deduction that may
> > > cause a hard error.  In the worst case (there's no such candidate), we
> > > have to redundantly consider some candidates twice.  (In a previous
> > > version of the patch, to avoid this redundant checking I created a new
> > > "deferred" conversion type that represents a conversion that is yet to
> > > be performed, and instead of reconsidering a candidate I just realized
> > > its deferred conversions.  But it doesn't seem this redundancy is a
> > > significant performance issue to justify the added complexity of this
> > > other approach.)
> 
> OK, thanks.

Thanks a lot.  After retesting the patch, I ran into a libstdc++
testsuite failure involving reversed operator candidates that I somehow
missed earlier.  The failure is due to a bug in the last check below

      if (cand->viable == -1
          && shortcut_bad_convs
          && !cand->convs[cand->num_convs - 1]) // BUG
        {
          /* This candidate has been tentatively marked non-strictly viable,
             and we didn't compute all argument conversions for it (having
             stopped at the first bad conversion).  Add the function to BAD_FNS
             to fully reconsider later if we don't find any strictly viable
             candidates.  */
          bad_fns = lookup_add (fn, bad_fns);
          *candidates = (*candidates)->next;
        }

which assumes a shortcutted candidate must have a missing conversion
at the end of the conversion array (since argument conversions are
processed in left-to-right order).  But for a reversed candidate the
missing conversion would instead be at the front of the array, since we
reversed the order of its conversions in add_candidate.  So I changed
the problematic check to

  !cand->convs[cand->reversed () ? 0 : cand->num_convs - 1])

and committed the patch with this (presumably uncontroversial) change.

> 
> > > Lots of care was taken to preserve the de facto behavior w.r.t.
> > > non-strictly viable candidates, but I wonder how important this behavior
> > > is nowadays?  Can the notion of a non-strictly viable candidate be done
> > > away with, or is it here to stay?
> > 
> > To expand on this, as a concrete alternative to this optimistic shortcutting
> > trick we could maybe recognize non-strictly viable candidates only when
> > -fpermissive (and just mark them as unviable when not -fpermissive).  IIUC
> > this would be a backwards compatible change overall -- only diagnostics
> > would
> > be affected, probably for the better, since we'd explain the rejection
> > reason
> > for more candidates in the event of overload resolution failure.
> > 
> > Here's a testcase for which such a change would result in better
> > diagnostics:
> > 
> >    struct A {
> >      void f(int, int) const; // #1
> >      void f(int);            // #2
> >    };
> >    int main() {
> >      const A a;
> >      a.f(0);
> >    }
> > 
> > We currently consider #2 to be a better candidate than #1 because the
> > bad conversion of the 'this' argument makes it only non-strictly
> > viable, whereas #1 is considered unviable due to the arity mismatch.
> > So overload resolution selects #2 and we end up making no mention of #1
> > in the subsequent diagnostic:
> > 
> >    <stdin>: In function ‘int main()’:
> >    <stdin>:8:8: error: passing ‘const A’ as ‘this’ argument discards qualifiers [-fpermissive]
> >    <stdin>:3:8: note:   in call to ‘void A::f(int)’
> > 
> > Better would be to explain why neither candidate is a match:
> > 
> >    <stdin>:8:6: error: no matching function for call to ‘A::f(int) const’
> >    <stdin>:2:8: note: candidate: ‘void A::f(int, int) const’
> >    <stdin>:2:8: note:   candidate expects 2 arguments, 1 provided
> >    <stdin>:3:8: note: candidate: ‘void A::f(int)’
> >    <stdin>:3:8: note:   passing ‘const A*’ as ‘this’ argument discards qualifiers
> 
> Is that better?  Focusing diagnostics on the candidate you probably meant
> seems helpful in most cases.

Ah yeah, I see what you mean.

> 
> > 
> > Same for
> > 
> >    void f(int, int);
> >    void f(int*);
> >    int main() {
> >      f(42);
> >    }
> > 
> > for which we currently emit
> > 
> >    <stdin>: In function ‘int main()’:
> >    <stdin>:5:5: error: invalid conversion from ‘int’ to ‘int*’ [-fpermissive]
> >    <stdin>:2:8: note:   initializing argument 1 of ‘void f(int*)’
> > 
> > instead of
> > 
> >    <stdin>: In function ‘int main()’:
> >    <stdin>:5:4: error: no matching function for call to ‘f(int)’
> >    <stdin>:1:6: note: candidate: ‘void f(int, int)’
> >    <stdin>:1:6: note:   candidate expects 2 arguments, 1 provided
> >    <stdin>:2:6: note: candidate: ‘void f(int*)’
> >    <stdin>:2:6: note:   conversion of argument 1 would be ill-formed:
> >    <stdin>:5:5: error: invalid conversion from ‘int’ to ‘int*’ [-fpermissive]
> 
> Indeed, this one is less helpful; maybe it's time to remove the int<->pointer
> bad conversion.  In 2001 apparently it was still needed
> (https://gcc.gnu.org/pipermail/gcc-patches/2001-October/059124.html), but I
> would hope that the relevant code is dead by now.

Interesting, I'll look into removing it and see what might break.
