On 15/03/2022 21:09, Martin Sebor wrote:
> The strncmp function takes arrays as arguments (not necessarily
> strings).  The main purpose of the -Wstringop-overread warning
> for calls to it is to detect calls where one of the arrays is
> not a nul-terminated string and the bound is larger than the size
> of the array.  For example:
>
>    char a[4], b[4];
>
>    int f (void)
>    {
>      return strncmp (a, b, 8);   // -Wstringop-overread
>    }
>
> Such a call is suspect: if one of the arrays isn't nul-terminated
> the call is undefined.  Otherwise, if both are nul-terminated there

Isn't "suspect" too harsh a description, though? The bound does not specify the size of a or b; it specifies the maximum extent to which to compare a and b, and that extent can be any application-specific limit. In fact, the limit could be the size of some arbitrary third buffer into which the contents of a or b must be copied, truncated to the bound.

I agree the call is undefined if one of the arrays is not nul-terminated, and that's the thing: nothing about the bound is undefined in this context; it is the NUL termination that is key.

> is no point in calling strncmp with a bound greater than their sizes.

There is, when the bound describes something else, e.g. the size of a third destination buffer into which one of the input buffers may get copied. Or when the bound describes the maximum length of a set of strings where only a subset of the strings are reachable in the current function and ranger sees that, allowing us to reduce our input string size estimate. The bound being the maximum of the lengths of the two input strings is just one of many possibilities.
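
To make the first case concrete, here is a minimal sketch of the kind of pattern I mean (the names and sizes are made up for illustration): the bound reflects the capacity of a destination buffer that one of the inputs may later be copied into with truncation, not the size of either input.

   #include <string.h>

   char dest[32];   /* target of a later truncating copy */
   char key[8];     /* the program guarantees this holds a nul-terminated string */

   int is_cached (void)
   {
     /* The bound is sizeof dest, the capacity of the eventual copy
        target, not a claim about the size of key.  As long as key is
        nul-terminated the comparison never reads past its terminator,
        so the call is well defined even though the bound exceeds
        sizeof key.  */
     return strncmp (dest, key, sizeof dest) == 0;
   }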

> With no evidence that this warning is ever harmful I'd consider

There is: the false positives were seen in Fedora/RHEL builds.

> suppressing it a regression.  Since the warning is a deliberate
> feature in a released compiler and GCC is now in a regression
> fixing stage, this patch is out of scope even if a case where
> the warning wasn't helpful did turn up (none has been reported
> so far).

Wait, I just reported an issue and it's across multiple packages in Fedora/RHEL :)

I think this is a regression since GCC 11, caused by misunderstanding the specification and assuming too strong a relationship between the size argument of strncmp (and indeed strnlen and strndup) and the size of the objects passed to it. Compliant code relies on the compiler to do the right thing here, i.e. to optimize the strncmp call to strcmp and not panic about the size argument being larger than the input buffer size. If such a diagnostic needs to stay at all, it ought to go into the analyzer, where looser heuristic suggestions like this are more acceptable and sometimes even appreciated.
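
A minimal sketch of the folding I'm referring to, with hypothetical names and values (not taken from the affected packages):

   #include <string.h>

   char buf[16];   /* the program guarantees this is always nul-terminated */

   int is_magic (void)
   {
     /* "magic" is 5 characters, well under the bound of 32, so the
        comparison stops at the literal's terminator at the latest and
        never reads past buf's terminator either; the call is
        equivalent to strcmp (buf, "magic") and the compiler is free
        to fold it that way.  */
     return strncmp (buf, "magic", 32) == 0;
   }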

FWIW, I'm open to splitting the warning levels as you suggested if that's the consensus, since it at least provides a way to make these warnings saner. However, I still haven't found the rationale presented so far compelling enough to justify these false positives; I just don't see a proportionate reward. Hopefully more people can chime in with their perspective on this.

Thanks,
Siddhesh
