Andreas Bogk <[EMAIL PROTECTED]> writes:

> > Making it defined and wrapping doesn't help at all. It just means you
> > write different checks, not less of them.
>
> You have just seen somebody who can be considered an expert in matters
> of writing C software come up with a check that looks correct, but is
> broken under current gcc semantics. That should make you think.
I'm not entirely unsympathetic to your arguments, but, assuming you are referring to the code I wrote three messages up, this comment is unfair. The code I wrote was correct and unbroken. You suggest that it is broken because an attacker could take control of vp->len and set it to INT_MAX. But that just means that code in some other part of the program must check user input and prevent that from occurring. In fact, given the proposed attack of extracting data from memory, INT_MAX is a red herring; any value larger than the actual memory buffer would suffice to read memory which should not be accessible.

I think a better way to describe your argument is that the compiler can remove a redundant test which would otherwise be part of a defense in depth. That is true. The thing is, most people want the compiler to remove redundant comparisons; most people don't want their code to have defense in depth, they want it to have just one layer of defense, because that is what will run fastest. We gcc developers get many more bug reports about missed optimizations than we do about confusing language-conformant code generation.

One simple way to avoid problems in which the compiler removes redundant tests: compile without optimization. Another simple way: learn the language semantics and think about them.

In any case, later today I hope to send out a patch for the -fstrict-overflow option.

Ian
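For readers following the thread: the pattern under discussion can be sketched roughly as below. The function names and shapes are hypothetical (the actual code from earlier in the thread is not reproduced here), and both functions assume nonnegative inputs. The point is that a post-addition check on a signed sum relies on overflow wrapping, which C does not guarantee, while a pre-addition check uses only defined arithmetic:

```c
#include <limits.h>

/* Fragile under strict-overflow semantics: signed overflow is
   undefined behavior, so an optimizing compiler may assume
   len + delta >= len always holds and delete the test. */
int grow_len_fragile(int len, int delta)
{
    int newlen = len + delta;   /* undefined if this overflows */
    if (newlen < len)           /* may be optimized away */
        return -1;
    return newlen;
}

/* Safe: test before adding, so no overflow can ever occur.
   INT_MAX - len cannot overflow for nonnegative len. */
int grow_len_checked(int len, int delta)
{
    if (delta > INT_MAX - len)
        return -1;              /* sum would exceed INT_MAX */
    return len + delta;
}
```

The second form stays correct at any optimization level because it never evaluates an expression whose result is undefined.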