> From: Andrew Pinski <[EMAIL PROTECTED]>
>>> No, they should be using -ftrapv instead, which traps on overflow, and
>>> then make sure they are not trapping when testing.
>>
>> - why? what language or whose code/target ever expects such a behavior?
> Everyone who writes C/C++ should know that overflow of signed is undefined.
>
> Now in Java it is defined, which is the reason why -fwrapv exists in the
> first place, since GCC has a "Java" compiler.
>
> I think you need to go back in the archives and read the discussions about
> when -fwrapv was added and see why it is not turned on by default for C.
> http://gcc.gnu.org/ml/gcc-patches/2003-05/msg00850.html
> http://gcc.gnu.org/ml/gcc-patches/2003-03/msg02126.html
> http://gcc.gnu.org/ml/gcc-patches/2003-03/msg01727.html
Thanks again. After fully reviewing those threads, I still conclude:

- C/C++ leaves signed integer overflow undefined because it is target-specific
behavior, just as dereferencing a NULL pointer is (although the large majority
of targets do in fact wrap on overflow, and do not fatally trap NULL
dereferences; so GCC has it backwards in both cases).

- Since such semantics are technically undefined, attempting to track and flag
the ambiguities is helpful; however, the compiler should always optimize based
on the true semantics of the target, which is what undefined semantics actually
permit. Pretending the target's semantics differ from the optimizer's
assumptions, or imposing after-the-fact run-time trapping semantics, is at best
useless and otherwise potentially worse: inefficient and/or erroneous.
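To make the disagreement concrete, here is a minimal C sketch (my own example,
not taken from the thread; the function name is hypothetical) of the semantics
at stake:

    #include <limits.h>
    #include <stdio.h>

    /* A post-hoc overflow check.  Under wrapping (two's-complement)
       semantics this reliably detects overflow, but because signed
       overflow is undefined in C, the optimizer may assume that
       a + 1 > a always holds and fold the test to a constant 0. */
    static int overflows_after_increment(int a)
    {
        return a + 1 < a;
    }

    int main(void)
    {
        printf("%d\n", overflows_after_increment(INT_MAX));
        return 0;
    }

As I understand the flags (exact results depend on GCC version and target):
built with plain -O2 this may print 0, the check having been folded away; with
-O2 -fwrapv GCC must honor wrapping semantics and print 1; and with -ftrapv the
overflowing addition aborts the program at run time, which is the testing
behavior Andrew recommends above.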