On Dec 20, 2006, at 09:38, Bruno Haible wrote:
> But the other way around? Without -fwrapv the compiler can assume more about the program being compiled (namely that signed integer overflows don't occur), and therefore has more freedom for optimizations. All optimizations that are possible with -fwrapv should also be performed without -fwrapv. Anything else is a missed optimization.
This is completely wrong. Making operations undefined is a two-edged sword. On the one hand, you can make more assumptions, but there is also the issue that when you want to rewrite expressions, you have to be more careful not to introduce undefined behavior where there was none before.

The canonical example is addition of signed integers. This operation is associative with -fwrapv, but not without it. So

    a = b + C1 + c + C2;

could be rewritten as

    a = b + c + (C1 + C2);

where the constant addition is performed at compile time. With signed addition overflow being undefined, you can't do any reassociation, because it might introduce overflows where none existed before. Probably we would want to lower many expressions to unsigned arithmetic eventually, but the question of when and where to do it emphasizes that you can only take advantage of undefined behavior if you make sure you don't introduce any.

Sometimes I think it would be far better to have -fwrapv as the default at -O1 and possibly -Os. Sure, this would disable some powerful optimizations, especially those involving loops, but in practice it would be very useful to get reasonably good optimization while minimizing the number of programs with undefined behavior. It would also allow some new optimizations, like the reassociation above, so the total loss of performance may be quite acceptable. And since -fwrapv only transforms programs with undefined behavior into programs with implementation-defined behavior, nobody can complain about their programs suddenly doing something different.

Also, for safety-critical programs and for certification, it is essential to be able to reason about program behavior, so limiting the set of programs with erroneous or undefined execution matters. If you want to prove that a program doesn't cause undefined behavior, it is very helpful for signed integer overflow to be defined, even if it's only implementation-defined. That would be a huge selling point for GCC.

-Geert