On 10/8/18 6:24 PM, David Malcolm wrote:
On Mon, 2018-10-08 at 10:37 -0400, Jason Merrill wrote:
On Thu, Oct 4, 2018 at 10:12 AM David Malcolm <dmalc...@redhat.com>
wrote:

-Wformat in the C++ FE doesn't work as well as it could:
(a) it doesn't report precise locations within the string literal,
and
(b) it doesn't underline arguments that are !CAN_HAVE_LOCATION_P,
despite their having location wrapper nodes.

For example:

   Wformat-ranges.C:32:10: warning: format '%s' expects argument of type 'char*', but argument 2 has type 'int' [-Wformat=]
   32 |   printf("hello %s", 42);
      |          ^~~~~~~~~~

(a) is due to not wiring up the langhook for extracting substring
     locations.

     This patch uses the one in c-family; it also fixes string
literal
     parsing so that it records string concatenations (needed for
     extracting substring locations from concatenated strings).
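     (For illustration only, a hedged testcase sketch that is not taken
     from the patch: once concatenation is recorded during parsing and the
     substring-location langhook is wired up, the caret can land on the
     offending directive even when the format string spans concatenated
     literals.)

       // Hypothetical example (names and layout are mine, not the
       // patch's testsuite): the format string is built from two
       // concatenated literals, so precise -Wformat locations need the
       // concatenation to have been recorded while parsing.
       #include <cstdio>

       void report (int id)
       {
         // "%s" vs. an int argument: with substring locations, the caret
         // should point at the "%s" inside the second literal chunk.
         std::printf ("id: %d, "
                      "name: %s\n", id, id);
       }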

(b) is due to the call to maybe_constant_value here:
        fargs[j] = maybe_constant_value (argarray[j]);
     within build_over_call.

Maybe we should remove that in favor of fold_for_warn in
check_function_arguments.

Jason

This patch eliminates the arglocs array I introduced to build_over_call
in r264887, and eliminates the call to maybe_constant_value when building
"fargs" (thus retaining location wrapper nodes).

Instead, this patch requires that any checks within
check_function_arguments that need folded arguments do their own folding.
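(An illustrative sketch of that direction, assuming the loop keeps its
existing shape; the "after" line is my guess, not the patch text:)

   for (j = 0; j < nargs; j++)
     {
       /* Before: constant folding here stripped the location wrappers
          from the copies handed to the warning machinery.  */
       /* fargs[j] = maybe_constant_value (argarray[j]);  */

       /* After: keep the wrappers intact; any check reached from
          check_function_arguments that needs a folded value calls
          fold_for_warn itself.  */
       fargs[j] = argarray[j];
     }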

Of the various checks:
(a) check_function_nonnull already calls fold_for_warn,
(b) check_function_format doesn't need folding,
(c) check_function_sentinel needs fold_for_warn in one place, which the
patch adds, and
(d) check_function_restrict needs per-argument folding, which the patch
adds.  Given that it scans before and after resetting TREE_VISITED on
each argument, it seemed best to make a copy of the array, folding each
argument from the outset, rather than repeatedly calling fold_for_warn.
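(A rough sketch of the shape of (d), with an assumed local name
"folded_args" and assumed parameter names; the actual patch may differ:)

   /* Sketch only: fold every argument once into a scratch copy, so that
      both TREE_VISITED passes see the same folded trees instead of
      calling fold_for_warn repeatedly per argument.  */
   tree *folded_args = XALLOCAVEC (tree, nargs);  /* alloca-based helper.  */
   for (int i = 0; i < nargs; i++)
     folded_args[i] = fold_for_warn (argarray[i]);

   /* ... the existing scans before and after resetting TREE_VISITED then
      operate on folded_args rather than argarray ... */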

Successfully bootstrapped & regrtested on x86_64-pc-linux-gnu.

OK for trunk?

OK, thanks.

Jason
