From: Andrew Pinski <apin...@marvell.com>

The problem is that the buffer is too small to hold "-O" and the
integer. This fixes the problem by using the correct size instead.
Changes since v1:
* v2: Use HOST_BITS_PER_LONG and just divide by 3 instead of 3.32.

OK? Bootstrapped and tested on x86_64-linux with no regressions.

gcc/c-family/ChangeLog:

	PR c/101453
	* c-common.c (parse_optimize_options): Use the correct size
	for buffer.
---
 gcc/c-family/c-common.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/gcc/c-family/c-common.c b/gcc/c-family/c-common.c
index 20ec263..e7a54a5 100644
--- a/gcc/c-family/c-common.c
+++ b/gcc/c-family/c-common.c
@@ -5799,7 +5799,7 @@ parse_optimize_options (tree args, bool attr_p)
     if (TREE_CODE (value) == INTEGER_CST)
       {
-	char buffer[20];
+	char buffer[HOST_BITS_PER_LONG / 3 + 4];
 	sprintf (buffer, "-O%ld", (long) TREE_INT_CST_LOW (value));
 	vec_safe_push (optimize_args, ggc_strdup (buffer));
       }
-- 
1.8.3.1