https://gcc.gnu.org/bugzilla/show_bug.cgi?id=93582

--- Comment #15 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
With
struct S {
  unsigned int s1:1;
  unsigned int s2:1;
  unsigned int s3:1;
  unsigned int s4:1;
  unsigned int s5:4;
  unsigned char s6;
  unsigned short s7;
  unsigned short s8;
};
struct T {
  int t1;
  int t2;
};

static inline int
bar (struct S *x)
{
  /* Bitfield load tested directly in the if condition.  */
  if (x->s4)
    return ((struct T *)(x + 1))->t1 + ((struct T *)(x + 1))->t2;
  else
    return 0;
}

int
foo (int x, int y)
{
  struct S s;
  s.s6 = x;
  s.s7 = y & 0x1FFF;
  s.s4 = 0;
  return bar (&s);
}

static inline int
qux (struct S *x)
{
  /* Same as bar, but the bitfield load is separated from the test.  */
  int s4 = x->s4;
  if (s4)
    return ((struct T *)(x + 1))->t1 + ((struct T *)(x + 1))->t2;
  else
    return 0;
}

int
baz (int x, int y)
{
  struct S s;
  s.s6 = x;
  s.s7 = y & 0x1FFF;
  s.s4 = 0;
  return qux (&s);
}

we don't actually warn about the baz/qux case; the only difference is that the
source separates the bitfield load from the comparison, so the premature fold
"optimization" doesn't trigger anymore.  And with -fno-tree-sra the
s.s4 = 0; _10 = s.s4; sequence is optimized during FRE1 value numbering.
The BIT_FIELD_REF also prevents SRA... :(
Anyway, it is clear why value numbering BIT_FIELD_REF <s, 8, 0> can't work in
this case: the other bits are uninitialized, or could e.g. be non-constant, etc.
My thought was that we could just special-case BIT_FIELD_REF <s, N, M> & C,
either only where C is a power of two, so that we extract a single bit and
handle it as a visit_reference_op_load of a BIT_FIELD_REF covering just that
bit (in this case BIT_FIELD_REF <s, 1, 3>), or perhaps iteratively, trying to
capture the bits one by one.
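Just to illustrate the single-bit case in plain C (only a model of the
equivalence, not GIMPLE and not a proposed patch): masking a byte with a
power-of-two constant C == 1 << k depends only on the single bit at offset k,
which is exactly what a 1-bit BIT_FIELD_REF at that offset reads.

#include <stdio.h>

/* Model of BIT_FIELD_REF <byte, 1, k>: read just the single bit k.  */
static unsigned
read_bit (unsigned char byte, unsigned k)
{
  return (byte >> k) & 1u;
}

int
main (void)
{
  unsigned char byte = 0x5a;   /* bits 7..0 are 01011010, so bit 3 is set */
  unsigned k = 3;
  /* Model of BIT_FIELD_REF <byte, 8, 0> & 8: whole-byte load, then mask.  */
  unsigned via_mask = (byte & (1u << k)) != 0;
  unsigned via_single_bit = read_bit (byte, k);
  printf ("%u %u\n", via_mask, via_single_bit);   /* prints "1 1" */
  return 0;
}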
Even that still doesn't work, though, as vn_reference_lookup_3 punts on
anything that is not byte-aligned and byte-sized:
  /* 3) Assignment from a constant.  We can use folds native encode/interpret
     routines to extract the assigned bits.  */
  else if (known_eq (ref->size, maxsize)
           && is_gimple_reg_type (vr->type)
           && !contains_storage_order_barrier_p (vr->operands)
           && gimple_assign_single_p (def_stmt)
           && CHAR_BIT == 8 && BITS_PER_UNIT == 8
           /* native_encode and native_decode operate on arrays of bytes
              and so fundamentally need a compile-time size and offset.  */
           && maxsize.is_constant (&maxsizei)
           && maxsizei % BITS_PER_UNIT == 0                <============= This fails, maxsizei is 1
           && offset.is_constant (&offseti)
           && offseti % BITS_PER_UNIT == 0                 <============= This would fail too, offseti is 3
           && (is_gimple_min_invariant (gimple_assign_rhs1 (def_stmt))
               || (TREE_CODE (gimple_assign_rhs1 (def_stmt)) == SSA_NAME
                   && is_gimple_min_invariant (SSA_VAL (gimple_assign_rhs1 (def_stmt))))))
...
      if (base2
          && !reverse
          && !storage_order_barrier_p (lhs)
          && known_eq (maxsize2, size2)
          && multiple_p (size2, BITS_PER_UNIT)             <============= This would fail too, size2 would be 1
          && multiple_p (offset2, BITS_PER_UNIT)           <============= This would fail too, offset2 would be 3
          && adjust_offsets_for_equal_base_address (base, &offset,
                                                    base2, &offset2)
          && offset.is_constant (&offseti)
          && offset2.is_constant (&offset2i)
          && size2.is_constant (&size2i))
So, even if we don't want to handle all the bitfield vs. bitfield details for
now, could we perhaps special-case the offsets/sizes that aren't multiples of
BITS_PER_UNIT when offseti == offset2i && maxsizei == size2i, and in that case
skip native_encode/interpret and just handle an INTEGER_CST rhs by
fold_convert-ing it to the right type?
These bitfields really ought to be integral types...
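
To make the intent concrete, here is a plain-C model of that proposed special
case (my sketch of the idea only, not actual tree-ssa-sccvn code): when the
store's bit range and the load's bit range are identical, the stored constant
just needs converting to the load's type, with no byte-wise
native_encode/native_interpret round trip.

#include <stdio.h>

/* Hypothetical model: a constant store of stored_cst covering bits
   [store_off, store_off + store_size) is looked up by a load of bits
   [load_off, load_off + load_size).  If the ranges match exactly, the
   result is just the constant converted to the load's type (int here,
   standing in for the fold_convert step); otherwise punt, as
   vn_reference_lookup_3 does today.  */
static int
lookup_same_bits (long long stored_cst, int store_off, int store_size,
                  int load_off, int load_size, int *result)
{
  if (store_off == load_off && store_size == load_size)
    {
      *result = (int) stored_cst;
      return 1;
    }
  return 0;
}

int
main (void)
{
  int val;
  /* s.s4 = 0 stores a 1-bit constant at bit offset 3; the later 1-bit
     load of s.s4 reads exactly the same bits.  */
  if (lookup_same_bits (0, 3, 1, 3, 1, &val))
    printf ("folded to %d\n", val);   /* prints "folded to 0" */
  return 0;
}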
