> Ian Lance Taylor writes:
>
> gcc assumes that if a variable is uninitialized, it can use any value
> whatsoever for any use of the variable.  gcc does not assume that all
> uses of an uninitialized variable must have the same value.
>
> It is of course possible to write a correct program which uses an
> uninitialized variable.  However, I believe that such a program must
> never examine the variable in any way whatsoever, so I don't agree
> that it is possible to detect the difference in values.  Even
> comparing the uninitialized variable to itself is undefined behaviour.
> Given that, I think that gcc's current approach will generate the most
> efficient code for correct programs which have uninitialized
> variables.

- I believe that it is a grave mistake to conclude that a well-defined
  semantic operation on an indeterminate value has undefined semantics;
  that is simply an erroneous conclusion.

  As a simple example, although x may be indeterminate, -1 <= sin(x) <= +1
  is unconditionally true, as must be tan(x) = sin(x)/cos(x) wherever
  cos(x) != 0, and x ^ x == 0 (taking ^ as C's exclusive-or).

  So although neither the compiler nor the program may know what the value
  of x is until run time (unless the compiler chooses to give it a default
  initial value of convenience), whatever that value turns out to be, it
  must be utilized in a semantically consistent manner.

 (As a general rule, optimization should never alter the observable
  semantics of a program within its control; so if GCC chooses not to
  assign a default value to an otherwise uninitialized variable, it must
  assume the variable holds some single value, not several different
  values at once; although it should feel free to complain loudly about
  it in either case.)

