https://gcc.gnu.org/bugzilla/show_bug.cgi?id=117355

--- Comment #12 from Siddhesh Poyarekar <siddhesh at gcc dot gnu.org> ---
Even smaller reproducer that actually shows what's going on:

```
typedef unsigned long size_t;

#define STR "bbbbbbbbbbbbbbbbbbbbbbbbbbb"

void
gen_blr (void)
{
  char line[256];
  const char *p = STR;
  const char *q = p + sizeof (STR) - 1;

  char *q1 = line;
  for (const char *p1 = p; p1 < q;)
    {
      *q1++ = *p1++;

      if (p1 < q && (*q1++ = *p1++) != '\0')
        {
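          /* q1 - 2 points into line[256], so neither the mode-0 (maximum)
             nor the mode-1 (closest surrounding subobject) size should be
             zero here; the aborts flag the bogus zero result.  */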
          if (__builtin_object_size (q1 - 2, 0) == 0)
            __builtin_abort ();
          if (__builtin_object_size (q1 - 2, 1) == 0)
            __builtin_abort ();
        }
    }
}

int
main ()
{
  gen_blr ();
}

```

The problem here is that the C++ frontend generates a dead branch, which
causes early_objsz to overestimate the prospective size of q1.  Combined with
the negative offset, that overestimate makes the pass conclude there is an
overflow, so it returns a zero size.
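
To make the failure mode concrete, here is a small standalone model of the
arithmetic (my own simplification, not the actual size_for_offset
implementation, and the 254/256 values are only illustrative): with a distinct
whole size and a negative offset, the offset is re-expressed relative to the
whole object, and an overestimated size makes that adjusted offset wrap around
in unsigned arithmetic, so it looks out of bounds and the available size
collapses to zero.

```
#include <stdio.h>

/* Simplified model: SZ is the estimated maximum number of bytes remaining
   at the pointer, OFF the constant offset, WHOLESIZE the size of the whole
   object.  The offset is re-expressed relative to the whole object; an
   overestimated SZ makes it wrap around and look out of bounds, so the
   computed size collapses to zero.  */
static unsigned long
model_size_for_offset (unsigned long sz, long off, unsigned long wholesize)
{
  unsigned long adj = wholesize + (unsigned long) off - sz;

  /* Stand-in for the offset_limit sanity check.  */
  if (adj > wholesize)
    return 0;

  return wholesize - adj;
}

int
main (void)
{
  /* line[256] in the reproducer: with a correct estimate of 254 bytes
     remaining at q1, q1 - 2 has 256 bytes available; with the dead-branch
     overestimate of 256, the adjusted offset wraps and the result is 0.  */
  printf ("correct estimate: %lu\n", model_size_for_offset (254, -2, 256));
  printf ("overestimate:     %lu\n", model_size_for_offset (256, -2, 256));
  return 0;
}
```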

The actual fix should be to have the GIMPLE_PHI evaluation return the maximum
size in the context of the expression it is part of, but the static size
computation logic doesn't really carry enough state for that.  For now I'm
testing a simpler patch that falls back to wholesize in such cases:

```
index 09aad88498e..79e9f86fc89 100644
--- a/gcc/tree-object-size.cc
+++ b/gcc/tree-object-size.cc
@@ -1530,7 +1530,20 @@ plus_stmt_object_size (struct object_size_info *osi, tree var, gimple *stmt)
               || bytes != wholesize
               || (size_valid_p (op1, object_size_type)
                   && compare_tree_int (op1, offset_limit) <= 0))
-       bytes = size_for_offset (bytes, op1, wholesize);
+       {
+         bytes = size_for_offset (bytes, op1, wholesize);
+
+         /* For static maximum size, where we have a distinct WHOLESIZE and a
+            negative offset, we may end up with a zero size because of too
+            large an estimate for OP0; a negative offset then ends up with a
+            potential underflow, which SIZE_FOR_OFFSET sees as zero available
+            size. Fall back to WHOLESIZE in such cases.  */
+         if (!(object_size_type & (OST_DYNAMIC | OST_MINIMUM))
+             && bytes == size_zero_node
+             && size_valid_p (op1, object_size_type)
+             && compare_tree_int (op1, offset_limit) > 0)
+           bytes = wholesize;
+       }
       /* In the static case, with a negative offset, the best estimate for
         minimum size is size_unknown but for maximum size, the wholesize is a
         better estimate than size_unknown.  */
```
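
If the fallback triggers for the reproducer above, both __builtin_object_size
calls should then report the whole size of line (256 bytes) rather than 0,
which remains a valid upper bound for the maximum-size modes; that seems
acceptable until the GIMPLE_PHI evaluation can track the enclosing expression
context.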
