On 08/24/2016 04:03 PM, Joseph Myers wrote:
> On Wed, 24 Aug 2016, Martin Sebor wrote:
>> No. I recall having seen Glibc fail with ENOMEM years ago when
>> formatting a floating point number to a very large precision but
>> I haven't seen any implementation fail. I haven't yet looked to
>> see if the Glibc failure can still happen. My reading of C and
>> POSIX is that snprintf is only allowed to fail due to an encoding
>> error, not because it runs out of memory, so such a failure would
>> seem like a bug.
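
For concreteness, here is a minimal stand-alone test of the scenario
described above (my own sketch, not code from the original report): it
formats a double to a very large precision and checks the negative
return value that C reserves for encoding errors.

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>

int main (void)
{
  /* A precision this large forces the conversion to produce a result
     far beyond the 4095-character minimum discussed below, so whether
     the call may legitimately fail with ENOMEM is exactly the open
     question.  */
  const int prec = 1000000;

  char *buf = malloc (prec + 64);
  if (!buf)
    return 1;

  errno = 0;
  int n = snprintf (buf, prec + 64, "%.*f", prec, 3.141592653589793);
  if (n < 0)
    perror ("snprintf");   /* C allows this only for an encoding error.  */
  else
    printf ("snprintf returned %d\n", n);

  free (buf);
  return 0;
}
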
> It's a general ISO C principle that there may be implementation limits,
> including in areas where no specific minimum limit is given in the
> standard, and even where there is a minimum, that doesn't mean that every
> possible program that does not exceed that minimum does not exceed the
> limit in that area. In this case, a specific minimum limit is given,
> 7.21.6.1#15 "The number of characters that can be produced by any single
> conversion shall be at least 4095.". That is, libc can't reject all cases
> of larger results, but it's possible that if memory is short then some
> cases with shorter results could still fail.
It sounds like using the 4095 limit should be safe even with Glibc.
I'll use it then, thanks.
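
For reference, this is the sort of single conversion that stays within
that guarantee (just an illustration of the 7.21.6.1#15 limit, not code
from the patch): a conversion padded to exactly 4095 characters is the
largest result every conforming implementation must support.

#include <stdio.h>

enum { PORTABLE_MAX_CONVERSION = 4095 };

int main (void)
{
  char buf[PORTABLE_MAX_CONVERSION + 1];

  /* The field width pads a single %u conversion to exactly 4095
     characters, the minimum every implementation must handle.  */
  int n = snprintf (buf, sizeof buf, "%4095u", 42u);
  printf ("single conversion produced %d characters\n", n);
  return 0;
}
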
>> It sounds like the concern is that for the following call (when
>> UCHAR_MAX is 255):
>> sprintf (d, "%hhu", 1000)
>> some implementation (an old version of Glibc?) may have actually
>> produced four digits and returned 4 on the basis of C saying that
>> the %hhu argument must be an unsigned char (promoted to int) and
>> thus the behavior of the call being undefined.
> Yes.
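
To spell out the two cases for anyone following along (a hypothetical
test program, not something posted in the thread):

#include <stdio.h>

int main (void)
{
  char d[16];

  /* Well defined: 200 fits in unsigned char, so this prints "200"
     and returns 3.  */
  int n1 = sprintf (d, "%hhu", (unsigned char) 200);

  /* Undefined: 1000 does not fit in unsigned char.  Glibc converts
     the argument to unsigned char and prints "232" (1000 % 256), but
     an implementation could also print "1000" and return 4, which is
     the case the optimization would have to allow for.  */
  int n2 = sprintf (d, "%hhu", 1000);

  printf ("n1 = %d, n2 = %d\n", n1, n2);
  return 0;
}
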
Since it's presumably a Glibc bug (bug 2509?), is this something
you believe the optimization needs to worry about? If so, can
you confirm that only Glibc versions 2.4 and prior are affected
and 2.5 is not?
Martin