Michael Matz <m...@suse.de> writes:
> Hello,
>
> I haven't followed this thread too closely, in particular I haven't seen
> why the current form of the .DEFERRED_INIT call was chosen or suggested,
> but it triggered my "well, that's obviously wrong" gut feeling; so sorry
> for stating something which might be obvious to thread participants here.
Well, it does feel like this is pressing the reset button on a thread
whose message count is already in the high double figures.  There's a
risk that restarting the discussion now is going to repeat a lot of the
same ground, probably at similar length.  (But I realise that the sheer
length of the thread is probably also the main reason for not having
followed it closely. :-))

To recap, the purpose of the option is to do two things simultaneously:

(1) make the object code behave “as if” automatic variables were
    initialised with a fill pattern, before any explicit initialisation

(2) continue to treat uses of uninitialised automatic variables as
    (semantically) undefined behaviour

When this was discussed on the clang list, (2) was a key requirement
and was instrumental in getting the feature accepted.  The option is
specifically *not* trying to change language semantics or create a fork
in the language.

The option is both a security and a debugging feature.  As a security
feature, it would be legitimate to optimise:

  int foo;
  if (flag)
    foo = 1;
  x = foo;

into:

  x = 1;

since the behaviour is at least deterministic.  But from a debugging
QoI perspective, this should not happen.  We should generate the same
code as for:

  int foo = <fill-pattern>;
  if (flag)
    foo = 1;
  x = foo;

However, because of (2), we should still generate a -Wmaybe-uninitialized
warning for the “x = foo” assignment.

This is not a combination we support at the moment, so something needs
to give.  The question of course is what.

> Anyway:
>
> On Thu, 1 Jul 2021, Richard Sandiford via Gcc-patches wrote:
>
>> >> It's not a bug in SSA, it's at most a missed optimization there.
>> >
>> > I still think that SSA cannot handle block-scoped variable correctly
>> > in this case, it should not move the variable out side of the block
>> > scope. -:)
>> >
>> >> But with -ftrivial-auto-var-init it becomes a correctness issue.
>>
>> I might have lost track of what “it” is here.  Do you mean the
>> propagation, or the fact that we have a PHI in the first place?
>>
>> For:
>>
>> unsigned int
>> f (int n)
>> {
>>   unsigned int res = 0;
>>   for (int i = 0; i < n; i += 1)
>>     {
>>       unsigned int foo;
>>       foo += 1;
>
> Note that here foo is used uninitialized.  That is the thing from which
> everything else follows.

Right, that was the point.

> Garbage in, garbage out.  It makes not too much sense to argue that
> the generated PHI node on this loop (generated because of the
> exposed-upward use of foo) is wrong, or right, or should better be
> something else.  The input program was broken, so anything makes sense.

Yeah, without the new option it's GIGO.  But I think it's still
possible to rank different forms of garbage output in terms of how
self-consistent they are.  IMO it makes no sense to say that “foo” is
upwards-exposed to somewhere where “foo” doesn't exist.  Using
“foo_N(D)” doesn't have that problem, so I think it's possible to say
that it's “better” garbage output. :-)

And the point is that with the new option, this is no longer garbage
input as far as SSA is concerned: carrying the old value of “foo”
across iterations would not be implementing the option correctly.
How SSA handles this becomes a correctness issue.

> That is the same with Qings shorter testcase:
>
>   6   for (i = 0; i < a; i++) {
>   7     if (__extension__({int size2;
>   8        size2 = ART_INIT (size2, 2);
>
> Uninitialized use of size2 right there.
> And it's the same for the use of
> .DEFERRED_INIT as the patch does:
>
>   {
>     int size2;
>     size2 = .DEFERRED_INIT (size2, 2);
>     size2 = 4;
>     _1 = size2 > 5;
>     D.2240 = (int) _1;
>   }
>
> Argument of the pseudo-function-call to .DEFERRED_INIT uninitialized ->
> boom.
>
> You can't solve any of this by fiddling with how SSA rewriting behaves at
> a large scale.  You need to change this and only this specific
> interpretation of a use.  Or preferably not generate it to start with.
>
>> IMO the foo_3 PHI makes no sense.  foo doesn't live beyond its block,
>> so there should be no loop-carried dependency here.
>>
>> So yeah, if “it” meant the fact that variables live too long,
>> then I agree that becomes a correctness issue with this feature,
>> rather than just an odd quirk.  Could we solve it by inserting
>> a clobber at the end of the variable's scope, or something like that?
>
> It would possibly make GCC not generate a PHI on this broken input, yes.
> But why would that be an improvement?

Hopefully the above answers this.

>> > Agreed, I have made such change yesterday, and this work around the
>> > current issue.
>> >
>> > temp = .DEFERRED_INIT (temp/WITH_SIZE_EXPR(temp), init_type)
>> >
>> > To:
>> >
>> > temp = .DEFERRED_INIT (SIZE_of_temp, init_type)
>>
>> I think we're going round in circles here though.  The point of having
>> the undefined input was so that we would continue to warn about undefined
>> inputs.  The above issue doesn't seem like a good justification for
>> dropping that.
>
> If you want to rely on the undefined use for error reporting then you must
> only generate an undefined use when there was one before, you can't just
> insert new undefined uses.  I don't see how it could be otherwise, as soon
> as you introduce new undefined uses you can and will run into GCC making
> use of the undefinedness, not just this particular issue with lifetime and
> PHI nodes which might be "solved" by clobbers.
>
> I think it's a chicken egg problem: you can't add undefined uses, for
> which you need to know if there was one, but the facility is supposed to
> help detecting if there is one to start with.

The idea is that .DEFERRED_INIT would naturally get optimised away by
DCE/DSE if the variable is provably initialised before it's used.

Thanks,
Richard
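
PS. To make that DCE/DSE expectation concrete, here's the kind of
behaviour I'd hope to see.  This is only a sketch written for this
message, not something taken from the patch or its testsuite, and it
assumes the =pattern form of the option together with
-Wmaybe-uninitialized:

  /* Sketch: compile with the proposed -ftrivial-auto-var-init=pattern,
     plus -O2 -Wmaybe-uninitialized.  */

  unsigned int
  definitely_initialised (int n)
  {
    unsigned int x;     /* Gimplified to x = .DEFERRED_INIT (...), but the
                           unconditional store below makes that call dead,
                           so DSE/DCE should remove it: no extra code and
                           no warning.  */
    x = n + 1;
    return x;
  }

  unsigned int
  maybe_initialised (int flag)
  {
    unsigned int foo;   /* Here the .DEFERRED_INIT has to survive: foo is
                           only conditionally assigned, so the fill pattern
                           must reach the return below (point (1) above),
                           while -Wmaybe-uninitialized should still warn
                           about the use, because it remains undefined
                           behaviour at the language level (point (2)).  */
    if (flag)
      foo = 1;
    return foo;
  }

The first function is the case the DCE/DSE comment is about; the second
is the “x = foo” case from the recap, where both the pattern
initialisation and the warning need to survive.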