From: Andrew Pinski <apin...@marvell.com>

Every base 10 digit takes ~3.32 bits to represent, so a 64-bit
signed integer needs up to 20 characters (19 digits plus the sign).
The buffer was only 20 bytes, so the number alone filled it; once
the "-O" prefix and the terminating null character are added, the
buffer is 3 bytes too small.
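
To make the arithmetic concrete, here is a small standalone program
(illustrative only, not part of the patch) that prints the worst-case
length on an LP64 host:

  #include <stdio.h>
  #include <limits.h>

  int
  main (void)
  {
    char buf[64];
    /* "-O" plus the 20 characters of LONG_MIN; snprintf returns the
       length without the NUL, so this prints 22, i.e. 23 bytes are
       needed while the old buffer had 20.  */
    int len = snprintf (buf, sizeof buf, "-O%ld", LONG_MIN);
    printf ("%d\n", len);
    return 0;
  }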

Instead of just increasing the size of the buffer, I decided to
calculate the required size at compile time.
Since GCC is written in C++11, constexpr is the cleanest way to
force the size to be computed at compile time.
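
As a standalone sketch of the idea (the names here are illustrative,
not the patch itself):

  #include <limits.h>

  /* Each decimal digit carries log2 (10) ~= 3.32 bits, so dividing
     the bit width of long by 3.32 and adding one bounds the digit
     count; on an LP64 host this yields 20, i.e. 19 digits plus the
     sign.  */
  constexpr double log2_10 = 3.32;
  constexpr int longdigits
    = (int) (sizeof (long) * CHAR_BIT / log2_10) + 1;

  static_assert (longdigits + 3 >= 23 || sizeof (long) < 8,
                 "buffer size computed at compile time");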

OK? Bootstrapped and tested on x86_64-linux with no regressions.

gcc/c-family/ChangeLog:

        PR c/101453
        * c-common.c (parse_optimize_options): Compute the size of the
        buffer at compile time instead of hard-coding it.
---
 gcc/c-family/c-common.c | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/gcc/c-family/c-common.c b/gcc/c-family/c-common.c
index 20ec26317c5..4c5b75a9548 100644
--- a/gcc/c-family/c-common.c
+++ b/gcc/c-family/c-common.c
@@ -5799,7 +5799,9 @@ parse_optimize_options (tree args, bool attr_p)
 
       if (TREE_CODE (value) == INTEGER_CST)
        {
-         char buffer[20];
+         constexpr double log2_10 = 3.32;  /* Bits per base 10 digit.  */
+         constexpr int longdigits = (int) (sizeof (long) * CHAR_BIT / log2_10) + 1;
+         char buffer[longdigits + 3];  /* Digits, sign, "-O" and NUL.  */
          sprintf (buffer, "-O%ld", (long) TREE_INT_CST_LOW (value));
          vec_safe_push (optimize_args, ggc_strdup (buffer));
        }
-- 
2.27.0
