https://gcc.gnu.org/bugzilla/show_bug.cgi?id=104854
--- Comment #8 from Siddhesh Poyarekar <siddhesh at gcc dot gnu.org> ---
(In reply to Martin Sebor from comment #7)
> Moving warnings into the analyzer and scaling it up to be able to run by
> default, during development, sounds like a good long-term plan. Until that

That's not quite what I'm suggesting here. I'm not 100% convinced that those
are the right heuristics at all; the size argument for strnlen, strndup and
strncmp is not intended to describe the size of the passed strings. It is
only recommended security practice that the *n* variant functions be used
instead of their unconstrained relatives to mitigate overflows. In fact, in
the more common cases the size argument (especially for strnlen and strncmp)
may describe a completely different buffer or some other application-specific
property.

This is different from -Wformat-overflow, where there is a clear relationship
between the buffer, the format string and the string representation of input
numbers, and all we're tweaking is the optimism level of the warnings. So it
is not just a question of levels of verbosity/paranoia. In that context,
using the size to describe the underlying buffer of the source makes sense
only for a subset of uses, which makes this heuristic quite noisy.

So what I'm actually saying is: the heuristic is too noisy, but if we insist
on keeping it, it makes sense as an analyzer warning, where the user
*chooses* to look for pessimistic scenarios and is more tolerant of noisy
heuristics.

> happens, rather than gratuitously removing warnings that we've added over
> the years, just because they fall short of the ideal 100% efficacy (as has
> been known and documented), making them easier to control seems like a
> better approach.

It's not just a matter of efficacy here IMO. The heuristic for strnlen,
strncmp and strndup overreads is too loose for it to be taken seriously.