Jakub Jelinek <ja...@redhat.com> writes:
> On Tue, Jan 16, 2018 at 08:57:38AM +0100, Richard Biener wrote:
>> > -  unsigned HOST_WIDE_INT off
>> > -    = (tree_to_uhwi (DECL_FIELD_OFFSET (field))
>> > -       + tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field)) / BITS_PER_UNIT);
>> > -  if ((off % warn_if_not_align) != 0)
>> > -    warning (opt_w, "%q+D offset %wu in %qT isn't aligned to %u",
>> > +  tree off = byte_position (field);
>> > +  if (!multiple_of_p (TREE_TYPE (off), off, size_int (warn_if_not_align)))
>> 
>> multiple_of_p also returns 0 if it doesn't know (for the non-constant
>> case, obviously), so the warning should say "may not be aligned"?  Or
>> do we not want any false positives, which would mean multiple_of_p
>> should get a worker factored out that returns a tri-state value?
>
> A tri-state sounds like optimizing for a very uncommon case; I think it
> must be very rare in practice that we could prove the offset is
> definitely misaligned, and we'd have to extend multiple_of_p quite a bit
> to handle those cases.
>
> Here is an updated patch which says "may not be aligned" if off is
> non-constant.  While extending the testcase, I noticed we don't handle
> an IMHO quite important case in multiple_of_p, so the patch handles
> that too.  I've tried not to increase the asymptotic complexity of
> multiple_of_p, so except for the cases where both arguments are
> INTEGER_CSTs it shouldn't call multiple_of_p more times than before.
>
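To make the non-constant case above concrete, here is a hypothetical GNU C
snippet of the kind of field the new code has to cope with (illustration
only, not the actual gcc.dg/pr83844.c testcase):

  /* warn_if_not_aligned (16) requests a warning whenever a field of this
     type is not placed at a 16-byte-aligned offset.  */
  typedef unsigned int __attribute__ ((warn_if_not_aligned (16))) u32_a16;

  void
  f (int n)
  {
    /* GNU C extension: a variable-length array member in a block-scope
       struct.  The byte offset of B depends on N, so byte_position (B) is
       not an INTEGER_CST and multiple_of_p cannot prove it is a multiple
       of 16; with the patch this gets the "may not be aligned" wording
       instead of going through tree_to_uhwi.  */
    struct S
    {
      char a[n];
      u32_a16 b;
    } s;
    (void) s;
  }
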
> Ok for trunk if this passes bootstrap/regtest?
>
> 2018-01-16  Jakub Jelinek  <ja...@redhat.com>
>
>       PR c/83844
>       * stor-layout.c (handle_warn_if_not_align): Use byte_position and
>       multiple_of_p instead of unchecked tree_to_uhwi and UHWI check.
>       If off is not INTEGER_CST, issue a may not be aligned warning
>       rather than isn't aligned.  Use isn%'t rather than isn't.
>       * fold-const.c (multiple_of_p) <case BIT_AND_EXPR>: Don't fall through
>       into MULT_EXPR.
>       <case MULT_EXPR>: Improve the case when bottom and one of the
>       MULT_EXPR operands are INTEGER_CSTs and bottom is multiple of that
>       operand, in that case check if the other operand is multiple of
>       bottom divided by the INTEGER_CST operand.
>
>       * gcc.dg/pr83844.c: New test.
>
> --- gcc/stor-layout.c.jj      2018-01-15 22:40:14.009263280 +0100
> +++ gcc/stor-layout.c 2018-01-16 10:01:48.135111031 +0100
> @@ -1150,12 +1150,16 @@ handle_warn_if_not_align (tree field, un
>      warning (opt_w, "alignment %u of %qT is less than %u",
>            record_align, context, warn_if_not_align);
>  
> -  unsigned HOST_WIDE_INT off
> -    = (tree_to_uhwi (DECL_FIELD_OFFSET (field))
> -       + tree_to_uhwi (DECL_FIELD_BIT_OFFSET (field)) / BITS_PER_UNIT);
> -  if ((off % warn_if_not_align) != 0)
> -    warning (opt_w, "%q+D offset %wu in %qT isn't aligned to %u",
> -          field, off, context, warn_if_not_align);
> +  tree off = byte_position (field);
> +  if (!multiple_of_p (TREE_TYPE (off), off, size_int (warn_if_not_align)))
> +    {
> +      if (TREE_CODE (off) == INTEGER_CST)
> +     warning (opt_w, "%q+D offset %E in %qT isn%'t aligned to %u",
> +              field, off, context, warn_if_not_align);
> +      else
> +     warning (opt_w, "%q+D offset %E in %qT may not be aligned to %u",
> +              field, off, context, warn_if_not_align);
> +    }
>  }
>  
>  /* Called from place_field to handle unions.  */
> --- gcc/fold-const.c.jj       2018-01-15 10:02:04.119181355 +0100
> +++ gcc/fold-const.c  2018-01-16 10:48:10.444360796 +0100
> @@ -12595,9 +12595,34 @@ multiple_of_p (tree type, const_tree top
>        a multiple of BOTTOM then TOP is a multiple of BOTTOM.  */
>        if (!integer_pow2p (bottom))
>       return 0;
> -      /* FALLTHRU */
> +      return (multiple_of_p (type, TREE_OPERAND (top, 1), bottom)
> +           || multiple_of_p (type, TREE_OPERAND (top, 0), bottom));
>  
>      case MULT_EXPR:
> +      if (TREE_CODE (bottom) == INTEGER_CST)
> +     {
> +       op1 = TREE_OPERAND (top, 0);
> +       op2 = TREE_OPERAND (top, 1);
> +       if (TREE_CODE (op1) == INTEGER_CST)
> +         std::swap (op1, op2);
> +       if (TREE_CODE (op2) == INTEGER_CST)
> +         {
> +           if (multiple_of_p (type, op2, bottom))
> +             return 1;
> +           /* Handle multiple_of_p ((x * 2 + 2) * 4, 8).  */
> +           if (multiple_of_p (type, bottom, op2))
> +             {
> +               widest_int w = wi::sdiv_trunc (wi::to_widest (bottom),
> +                                              wi::to_widest (op2));
> +               if (wi::fits_to_tree_p (w, TREE_TYPE (bottom)))
> +                 {
> +                   op2 = wide_int_to_tree (TREE_TYPE (bottom), w);
> +                   return multiple_of_p (type, op1, op2);
> +                 }
> +             }

It doesn't really matter since this isn't performance-critical code,
but FWIW, there's a wi::multiple_of_p that would avoid the recursion
and do the sdiv_trunc as a side-effect.
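
A minimal sketch of what that hunk could look like with the overload of
wi::multiple_of_p that stores the quotient through its last argument
(untested, just to show the shape):

      /* Sketch: test divisibility and compute bottom / op2 in one step,
         instead of recursing into the tree-level multiple_of_p and then
         calling wi::sdiv_trunc separately.  */
      widest_int w;
      if (wi::multiple_of_p (wi::to_widest (bottom), wi::to_widest (op2),
                             SIGNED, &w)
          && wi::fits_to_tree_p (w, TREE_TYPE (bottom)))
        return multiple_of_p (type, op1,
                              wide_int_to_tree (TREE_TYPE (bottom), w));

That keeps a single division and drops one level of recursion, at the cost
of spelling out the signedness explicitly.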

Thanks,
Richard
