Qing Zhao <qing.z...@oracle.com> writes:
>> On Jul 1, 2021, at 1:48 AM, Richard Biener <richard.guent...@gmail.com> wrote:
>>
>> On Wed, Jun 30, 2021 at 9:15 PM Qing Zhao via Gcc-patches
>> <gcc-patches@gcc.gnu.org> wrote:
>>>
>>>> On Jun 30, 2021, at 1:59 PM, Richard Biener <rguent...@suse.de> wrote:
>>>>
>>>> On June 30, 2021 8:07:43 PM GMT+02:00, Qing Zhao <qing.z...@oracle.com> wrote:
>>>>>
>>>>>> On Jun 30, 2021, at 12:36 PM, Richard Biener <rguent...@suse.de> wrote:
>>>>>>
>>>>>> On June 30, 2021 7:20:18 PM GMT+02:00, Andrew Pinski <pins...@gmail.com> wrote:
>>>>>>> On Wed, Jun 30, 2021 at 8:47 AM Qing Zhao via Gcc-patches
>>>>>>> <gcc-patches@gcc.gnu.org> wrote:
>>>>>>>>
>>>>>>>> I came up with a very simple testing case that can reproduce the same issue:
>>>>>>>>
>>>>>>>> [qinzhao@localhost gcc]$ cat t.c
>>>>>>>> extern void bar (int);
>>>>>>>> void foo (int a)
>>>>>>>> {
>>>>>>>>   int i;
>>>>>>>>   for (i = 0; i < a; i++) {
>>>>>>>>     if (__extension__({int size2;
>>>>>>>>       size2 = 4;
>>>>>>>>       size2 > 5;}))
>>>>>>>>       bar (a);
>>>>>>>>   }
>>>>>>>> }
>>>>>>>
>>>>>>> You should show the full dump.  What we have is the following:
>>>>>>>
>>>>>>> size2_3 = PHI <size2_1(D), size2_13>
>>>>>>> <bb 3> :
>>>>>>> size2_12 = .DEFERRED_INIT (size2_3, 2);
>>>>>>> size2_13 = 4;
>>>>>>>
>>>>>>> So CCP decides to propagate 4 into the PHI and then decides size2_1(D)
>>>>>>> is undefined, so size2_3 is then considered 4 and propagates it into
>>>>>>> the .DEFERRED_INIT.
>>>>>>
>>>>>> Which means the .DEFERRED_INIT is inserted at the wrong place.
>>>>>
>>>>> Then, where is the correct place for .DEFERRED_INIT (size2, 2)?
>>>>>
>>>>> The variable "size2" is a block-scope variable which is declared inside
>>>>> the "if" condition:
>>>>
>>>> But that's obviously not how it behaves during the into-SSA phase,
>>>> since we're inserting a PHI for it - and we're inserting it because
>>>> of the use in the .DEFERRED_INIT call.
>>>> I suppose you need to fiddle with the SSA rewrite and avoid treating
>>>> the use as a use, but only for the purpose of inserting PHIs...
>>>
>>> Please see my other email on the new small testing case without
>>> -ftrivial-auto-var-init.  The same issue shows up in SSA with that
>>> testing case even without -ftrivial-auto-var-init.
>>> It looks like an existing bug in SSA to me.
>>
>> It's not a bug in SSA, it's at most a missed optimization there.
>
> I still think that SSA cannot handle block-scoped variables correctly
> in this case; it should not move the variable outside of the block
> scope.  -:)
>
>> But with -ftrivial-auto-var-init it becomes a correctness issue.
I might have lost track of what “it” is here.  Do you mean the
propagation, or the fact that we have a PHI in the first place?

For:

unsigned int
f (int n)
{
  unsigned int res = 0;
  for (int i = 0; i < n; i += 1)
    {
      unsigned int foo;
      foo += 1;
      res = foo;
    }
  return res;
}

we generate:

  unsigned int foo;
  int i;
  unsigned int res;
  unsigned int _8;

  <bb 2> :
  res_4 = 0;
  i_5 = 0;
  goto <bb 4>; [INV]

  <bb 3> :
  foo_10 = foo_3 + 1;
  res_11 = foo_10;
  i_12 = i_2 + 1;

  <bb 4> :
  # res_1 = PHI <res_4(2), res_11(3)>
  # i_2 = PHI <i_5(2), i_12(3)>
  # foo_3 = PHI <foo_6(D)(2), foo_10(3)>
  if (i_2 < n_7(D))
    goto <bb 3>; [INV]
  else
    goto <bb 5>; [INV]

  <bb 5> :
  _8 = res_1;
  return _8;

IMO the foo_3 PHI makes no sense.  foo doesn't live beyond its block,
so there should be no loop-carried dependency here.

So yeah, if “it” meant the fact that variables live too long, then I
agree that becomes a correctness issue with this feature, rather than
just an odd quirk.  Could we solve it by inserting a clobber at the end
of the variable's scope, or something like that?

>> I think the idea of avoiding the USE of the variable in .DEFERRED_INIT
>> and instead passing the init size is a good one and should avoid this
>> case (hopefully).
>
> Agreed, I made such a change yesterday, and it works around the
> current issue.  The call changed from:
>
> temp = .DEFERRED_INIT (temp/WITH_SIZE_EXPR(temp), init_type)
>
> to:
>
> temp = .DEFERRED_INIT (SIZE_of_temp, init_type)

I think we're going round in circles here, though.  The point of having
the undefined input was so that we would continue to warn about
undefined inputs.  The above issue doesn't seem like a good
justification for dropping that.

Thanks,
Richard