https://gcc.gnu.org/bugzilla/show_bug.cgi?id=116024

--- Comment #6 from Richard Biener <rguenth at gcc dot gnu.org> ---
(In reply to Artemiy Volkov from comment #5)
> Hi Andrew, thank you for the breakdown.  For i1() (the case applicable to
> the initial bug report), something like this seems to fix the issue:
> 
> diff --git a/gcc/match.pd b/gcc/match.pd
> index cf359b0ec0f..8ab6d47e278 100644
> --- a/gcc/match.pd
> +++ b/gcc/match.pd
> @@ -8773,2 +8773,10 @@ DEFINE_INT_AND_FLOAT_ROUND_FN (RINT)
>  
> +/* Transform comparisons of the form C1 - X CMP C2 to X - C1 CMP -C2.  */
> +(for cmp (lt le gt ge eq ne)
> +     rcmp (gt ge lt le eq ne)
> +  (simplify
> +   (cmp (minus INTEGER_CST@0 @1) INTEGER_CST@2)
> +   (if (TYPE_OVERFLOW_UNDEFINED (TREE_TYPE (@1)))
> +     (rcmp (minus @1 @0) (negate @2)))))
> +
>  /* Canonicalizations of BIT_FIELD_REFs.  */
> 
> Would it make sense to assign this ticket to me so I can refine and post
> the above patch and also tackle i2() and i3()?  (Should those be
> extracted to a separate PR, or is it fine to fix all three under this one?)

I don't think this is correct even for types with undefined behavior on
overflow, because you can't negate INT_MIN: when @2 is the minimum value of
the type, the folded constant -@2 is not representable.
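
A minimal counterexample sketch (illustrative only; the function and the
exact comparison are hypothetical, not a testcase from the PR):

  #include <limits.h>

  /* With the proposed pattern this would be rewritten to
     x - INT_MIN == -INT_MIN.  The negated constant -INT_MIN is not
     representable in int, and for x == 0 the original comparison is
     well defined (and true) while the rewritten subtraction overflows,
     so later undefined-overflow reasoning could fold the test to false.  */
  int f (int x)
  {
    return INT_MIN - x == INT_MIN;
  }

One way to avoid this (a sketch, not necessarily the final fix) would be to
additionally reject the case where @2 is the minimum value of its type,
e.g. with a condition like
!tree_int_cst_equal (@2, TYPE_MIN_VALUE (TREE_TYPE (@2))).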
