------- Comment #4 from dberlin at gcc dot gnu dot org  2007-11-22 04:48 -------
Subject: Re:  [4.3 Regression] SCCVN breaks gettext

On 22 Nov 2007 04:26:57 -0000, matz at gcc dot gnu dot org
<[EMAIL PROTECTED]> wrote:
>
>
> ------- Comment #2 from matz at gcc dot gnu dot org  2007-11-22 04:26 -------
> The problem starts already in the first iteration:
> Value numbering destptr_3 stmt = destptr_3 = PHI <dest_9(6), destptr_14(7)>
> Setting value number of destptr_3 to dest_9
>
> So, for now we assume dest_9 == destptr_3, quite okay, let's assume so.  Next
> statement:
> Value numbering destptr.2_15 stmt = destptr.2_15 = (int) destptr_3;
> Setting value number of destptr.2_15 to destptr.2_15
>
> Looks innocent, but what this actually does is entering the RHS
> ((int)destptr_3) into the unary hash-table, but with translated (!) ssa names,

Right, but this is the optimistic set of hash tables, so that is okay.
It is fine for optimistic assumptions to remain in the optimistic
table even after they turn out to be wrong.  What must always be
correct at the end of SCC iteration is SSA_VAL, for every name.
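The discipline Dan describes can be sketched in a few lines.  This is a
hypothetical, heavily simplified model for illustration only, not the GCC
code; process_scc, number_stmt, and the dictionaries are invented names:

```python
# Hypothetical sketch, not GCC internals: stale entries may accumulate
# in the optimistic table, but SSA_VAL must reach a fixpoint over the SCC.

def process_scc(scc, ssa_val, number_stmt):
    """Re-number every name in the SCC until SSA_VAL stops changing.

    scc         -- list of SSA names in one strongly connected component
    ssa_val     -- dict: SSA name -> current value number (SSA_VAL)
    number_stmt -- callback that recomputes one name's value using the
                   optimistic hash table and the current SSA_VAL
    """
    optimistic = {}          # stale associations may pile up here; harmless
    changed = True
    while changed:           # fixpoint iteration over the SCC
        changed = False
        for name in scc:
            new_val = number_stmt(name, optimistic, ssa_val)
            if ssa_val.get(name) != new_val:
                ssa_val[name] = new_val
                changed = True
    # Only the stabilized SSA_VAL is transferred onward; leftover
    # optimistic entries are simply never consulted again.
    return ssa_val
```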
> ergo it enters (int)dest_9 into the hashtable, as having destptr.2_15 as
> value.  I.e. (int)dest_9 == destptr.2_15.  From there on everything breaks
> apart, because nobody is ever removing this association from the hash-table.
>   In particular we still (wrongly) think that nitems_19 is zero.

I don't see where above it has set nitems_19 to zero.

> This can be worked around with also iterating until nothing changes with
> the new hash table (with valid_info).  That's obviously not what is wanted,

There should be no need: the fixpoint iteration of the optimistic
table should eventually converge on the values you want to insert into
the valid table.
That is, in fact, the whole point.

> so there has to be a way to either cleanup the hashtable after iterations
> (this also doesn't seem to be designed in this way),

Again, it's okay for the optimistic assumptions to remain in the
table; in fact, the design expects that to happen.
The paper goes into why this is so.

> information into the hash tables which might become invalid in later
> iterations.

No, this is also okay.
Again, it is fine for the optimistic hashtable to have invalid info.
It is not okay for SSA_VAL to end up with invalid value numbers at the
end of iteration.
> version in it), but this canonicalization needs to happen when looking up
> the hash table, not when _inserting_ into it, as canonicalization is transient
> and changes from iteration to iteration.
>
Again, this isn't right.  The paper goes into detail as to why it is
okay for the optimistic table to behave this way, and why it is okay
to do algebraic simplification/etc. on insert.
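Canonicalizing on insert means an expression is keyed by the values of its
operands at the moment it is entered.  A hypothetical one-function model of
this (lookup_or_insert and ssa_val are invented names, not the GCC API):

```python
# Hypothetical model of valueize-on-insert, not GCC code: the hash key
# uses SSA_VAL of the operand, so (int)destptr_3 is entered under the
# key (int, VAL(destptr_3)) rather than under the raw SSA name.

def lookup_or_insert(table, op, operand, ssa_val, result):
    # Valueize the operand before hashing; insert result if absent,
    # otherwise return the previously recorded value.
    key = (op, ssa_val.get(operand, operand))
    return table.setdefault(key, result)
```

A lookup of (int)dest_9 then hits the same entry as (int)destptr_3, which
is exactly the association the trace above shows being created.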

The real problem, it seems to me (unless you simply haven't pasted
that part of the trace), is that nitems_19 isn't part of the SCC but
should be.  By the time iteration of the SCC finishes, we should have
discovered that nitems_19 does not have the value 0.

The SCCVN implementations in real compilers that I know of do
canonicalization on insert, as does the original code from Rice's
massively scalar compiler (which is where the algorithm comes from).

Maybe we aren't traversing uses in function arguments during DFS walk?

Given the code

            size_t new_max = nitems + len2;

            if (new_max != len2)
                break;
            dest = foo (new_max);

            destptr = dest;
            while (len2--)
                destptr++;

            nitems = destptr - dest;

the SCC we get should consist of destptr, dest, nitems, len2, and new_max.

I could see if we were not DFS walking the argument to foo for some
reason, we would never get new_max/nitems/len2 into the SCC.
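The effect of dropping the call-argument edge can be shown with a plain
reachability-based SCC check over the use-def graph of the snippet above.
This is a generic illustration, not GCC's Tarjan-based DFS walk, and len2's
loop-carried phi is elided here, so only four of the five names form the
cycle in this toy graph:

```python
# Generic illustration, not GCC code: an edge "x -> y" means the
# definition of x uses y.  Dropping the call-argument edge
# dest -> new_max (the use inside foo (new_max)) breaks the cycle,
# and nitems/new_max never join destptr's SCC.

def reachable(graph, src):
    seen, stack = {src}, [src]
    while stack:
        for m in graph.get(stack.pop(), ()):
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return seen

def scc_of(graph, node):
    # node's SCC = names reachable from node that can also reach node
    return frozenset(m for m in reachable(graph, node)
                     if node in reachable(graph, m))

# Use-def edges for the snippet (len2's loop-carried phi omitted):
graph = {
    'new_max': ['nitems', 'len2'],
    'dest':    ['new_max'],          # the use inside foo (new_max)
    'destptr': ['dest', 'destptr'],
    'nitems':  ['destptr', 'dest'],
}
```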

--Dan


-- 


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=34176
