> From: Michael Veksler <[EMAIL PROTECTED]>
>> Paul Schlie <[EMAIL PROTECTED]> wrote on 20/06/2005 16:09:16:
>>> From: Michael Veksler <[EMAIL PROTECTED]>
>>>
>>> As for overflow, you can say that you want "unspecified" instead of
>>> "undefined", where each architecture / opsys / compiler must
>>> consistently define what happens on overflow:
>>> - saturation
>>> - wrap 2's (or 1's) complement
>>> - exception.
>>
>> - Yes, effectively I don't perceive any necessity for "undefined" vs.
>> "unspecified", as I don't perceive any necessity to give the compiler
>> the freedom to generate an arbitrary program which may contain a
>> potentially ambiguous, specific, and isolatable behavior. Again, it
>> seems quite simple to abide by C's sequence-point and as-if rules to
>> contain any ambiguity to the bounds between its logical sequence
>> points; any resulting side effects specific to that ambiguity must be
>> expressed and logically bounded there.
>
> Look again at my dangling pointer example. There, even the most
> benign optimizations may "generate an arbitrary program".
> As I said, and as Robert Dewar concurred, you can carefully define
> something less strict than "undefined" on a case-by-case basis.
> On the other hand, it is impossible to make all "undefined" cases
> demonstrate an "isolatable behavior". Such a broad requirement is
> impossible to fulfill, as my dangling pointer example shows.
>
> ... A dangling pointer may or may not lead to corruption of the code
> itself (self-modifying code). When that inadvertently happens, all bets
> are off. It is possible that by pure luck the code is "safe" without
> optimization, and after optimizing unrelated stuff the code becomes
> self-destructive.
For what it's worth, I don't consider this a problem, as long as the semantics (including the relative sequencing) of any dangling-pointer reference are preserved in the process, regardless of the potentially unpredictable and/or dire consequences it may have. It is just this uncertainty that I view as the basis of the compiler's constraint. Unlike the apparently popular view that something un/ill-defined grants license to modify the specified code in any way desired, I view the uncertainty as a constraint: it forbids the compiler from presuming that the reference will store or return any predictable value. The reference therefore cannot be used as the basis of any optimization, and the compiler must preserve whatever unpredictable behavior may result upon its execution within some potentially arbitrary environment, while continuing to presume that the remaining program semantics and state are preserved; it may therefore continue to optimize based upon any predictable side effects which are not logically dependent on dangling-pointer references.

This strictly preserves the program's semantics as authored, as potentially unpredictable and possibly unrepeatable as the resulting behavior may be; I perceive doing otherwise as being inconsistent with the program as authored. I would, however, expect a suitable warning alerting the programmer to the likely unintended consequences whenever dangling-pointer references are identifiable. Where they are not, all things remain the same: only known-safe optimizations are performed, which by definition precludes any that depend on unpredictable value ranges and/or behaviors.