https://gcc.gnu.org/bugzilla/show_bug.cgi?id=120048

--- Comment #8 from Andrew Macleod <amacleod at redhat dot com> ---
It seems the root problem is that the ipa_vr class doesn't support UNDEFINED
ranges. In this case it has a range, but when the bitmask is explicitly
applied, we recognize that the range is actually UNDEFINED and convert it...
and then we trip over the UNDEFINED range because it was unexpected.

I'm going to look back at what happens when we always generate a mask and value
and keep the range properly normalized.

Aldy made it on-demand only because the cost was high at the time (5% of VRP),
but we occasionally keep tripping over out-of-sync issues like this one. There
(likely) wouldn't be a problem if the range were simply UNDEFINED at its
creation time rather than sitting in this intermediate state for "a while" and
only being normalized to UNDEFINED when the bitmask is applied.
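To illustrate the normalization in question, here is a minimal sketch of the idea: known bits from a bitmask can exclude every value in a range's bounds, at which point the range should collapse to UNDEFINED immediately rather than lingering in an inconsistent state. The `simple_range` type and `apply_mask` function below are hypothetical stand-ins, not GCC's actual irange/irange_bitmask API.

```cpp
#include <cstdint>

// Hypothetical, simplified model of a range with known bits.  Bits set in
// `known` are fixed and must equal the corresponding bits of `value`;
// GCC's real irange_bitmask uses a related but different encoding.
struct simple_range
{
  uint32_t lo, hi;        // inclusive bounds
  bool undefined;         // true == UNDEFINED (empty) range
};

// Apply the known-bits constraint to [lo, hi]: shrink the bounds past any
// values that violate the known bits, and normalize to UNDEFINED when
// nothing survives.  Linear scan; fine for an illustration, not for real use.
static simple_range
apply_mask (simple_range r, uint32_t known, uint32_t value)
{
  uint32_t lo = r.lo;
  while (lo <= r.hi && (lo & known) != (value & known))
    lo++;
  if (lo > r.hi)
    return { 0, 0, true };      // every value excluded: UNDEFINED
  uint32_t hi = r.hi;
  while ((hi & known) != (value & known))
    hi--;
  return { lo, hi, false };
}
```

For example, the range [4, 7] with bit 2 known to be zero contains no valid value at all (4 through 7 all have bit 2 set), so `apply_mask ({4, 7, false}, 4, 0)` normalizes to UNDEFINED; doing this eagerly at creation time is the alternative being considered above.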

I'm curious what the cost would be now, and whether it could be alleviated
enough to make it worthwhile.

Another option is to add a type to UNDEFINED, which I have looked at before.
That has its own issues, but I will take another look in a bit.
