On 07/24/2015 04:17 PM, Patrick Palka wrote:
On Fri, Jul 24, 2015 at 5:55 PM, Manuel López-Ibáñez
<lopeziba...@gmail.com> wrote:
On 24 July 2015 at 21:30, Jeff Law <l...@redhat.com> wrote:
On 07/24/2015 07:45 AM, Manuel López-Ibáñez wrote:

On 23 July 2015 at 19:43, Jeff Law <l...@redhat.com> wrote:

Warning in the front ends like this can generate false positives (such as
a NULL return in an unreachable path) and miss cases where the NULL has to
be propagated into the return by later optimizations.


False positives (for the warning as proposed right now) would be
strange, since it would mean that a returns_nonnull function returns
an explicit NULL in some code-path that is not meant to be executed.
That sounds like a bug waiting to happen.

Depends on how you choose to look at things.  It's quite common via macros &
such to have unexecutable/unreachable paths.  Whether or not to warn about
something found on such a path is a matter of personal preference.

I think it is also a matter of the particular warning and on the
balance of true positives vs. false positives in typical code-bases.
In this case, returning NULL in any code-path from a returns_nonnull
function, even if the path is currently unreachable via some macro
configuration, seems like a bad idea. Of course, I'm open to being
proven wrong :-)

Moreover, this warning works in exactly the same cases as
__attribute__((nonnull)) does for function arguments, and so far those
haven't been a source of false positives.

I'm sure I could write one ;-)  And given that an FE-based version of this
will only catch explicit NULLs in argument lists, I consider it so weak as
to be virtually useless.

Well, it is catching exactly all the cases that you were testing for
in your original -Wnull-attribute patch ;-)

I'm very much in favour of this, but only for things that cannot
reliably be warned about from the FE. Clang has shown that it is
possible to improve the accuracy of FE warnings much further and still
compile faster than GCC, by performing some kind of fast CCP (and VRP?)
in the FE (or by running the CCP and VRP passes earlier, even without
optimization):

And my assertion is that for cases like the ones we're discussing, you need
both the analysis and the optimizations to expose the non-trivial cases and
to prune away those which are false positives.  I consider exposing the
non-trivial cases and pruning away false positives the most important
aspect of this kind of work.

Based on my experience, I'm not convinced that moving warnings to the
middle-end is a good idea. The middle-end does a very poor job of
keeping sensible locations when transforming code, and it will not
only introduce false positives, it will also remove true positives.
The diagnostics often refer to the wrong variable, or to code that is
not what the user originally wrote, which makes the problem very hard
to understand. One only has to read all the reports we have about
-Warray-bounds, -Wuninitialized, -Wstrict-overflow and other
middle-end warnings.

For example, Clang is currently able to warn about the following case
without any optimization, while GCC cannot at any optimization level:

int f(bool b) {
   int n;
   if (b)
     n = 1;
   return n;  /* n is uninitialized when b is false */
}

Is there a PR for this particular test case?  I am interested in
improving the uninit analysis for GCC 6, so this potentially seems up
my alley.
To fix this you have to stop the reduction of degenerate PHIs when the RHS has a single real value and one or more undefined values. One of the advantages of the two-pass scheme I suggested years ago is that it would allow us to detect this before the optimizers collapsed the degenerate PHI.

jeff
