On Tue, Nov 06, 2007 at 05:44:44AM +1100, skaller wrote:
>
> On Mon, 2007-11-05 at 10:20 -0800, Joe Buck wrote:
> > On Mon, Nov 05, 2007 at 10:15:55AM -0800, Ian Lance Taylor wrote:
> > > skaller <[EMAIL PROTECTED]> writes:
> > > > Ah, I see. So turning [strict aliasing] off isn't really all that bad
> > > > for optimisation.
> > >
> > > It depends on the processor. For an in-order processor with a deep
> > > pipeline (e.g., Itanium, but there are others), the ability to reorder
> > > loads and stores is very important to permit scheduling flexibility.
> > > Strict aliasing reportedly makes a difference for x86, but I don't
> > > think it makes a huge difference.
> >
> > It also depends on the input: for scientific codes that process both
> > integer and real data, strict aliasing could make a large difference
> > on any processor.
>
> Hmm .. the problem is that C is fairly brain dead, and people
> regularly use casts: strict aliasing almost seems to break
> a fundamental feature of the language.
Strict aliasing has been the rule since 1989 and is a fundamental
feature.  Many programmers believe that they can cast with abandon, but
this is not the case.

> [Hmm .. how strict is it? int* and unsigned* have to be aliased,
> there's even a rule that guarantees that for the common subset
> of values the representation is the same (that rule would be
> useless if you couldn't alias an int and an unsigned!)

This isn't the place to teach people the exact rule; it's in the
standard.  As you say, the unsigned and signed versions of integral
types can alias.