Bruno Haible <[EMAIL PROTECTED]> writes:

> -- Macro: AC_TYPE_LONG_LONG_INT
>     If the C compiler supports a working `long long int' type, define
>     `HAVE_LONG_LONG_INT'.
>
> Requiring at least 64 bits would be a change in semantics.
Yes.  Whether this is a good thing depends on what one means by
"working".  For quite some time, the coreutils/gnulib long-long checker
has rejected a 'long long' implementation that does not support
long-long division or remainder (limitations of some pre-C99
implementations).  So if we have code that relies on HAVE_LONG_LONG
meaning "the type 'long long' exists", such code is already broken.

> The difference matters: vasnprintf needs to understand the "ll" or "L"
> size specifier, if the platform has a 'long long' type, regardless
> whether it is 64-bit (ISO C 99 compliant) or not (likely only 32-bit).

I must be missing something, since programs that don't think 'long long'
works shouldn't use the "ll" size specifier.  And portable programs
should use "L" only for floating-point conversions.  Or is the problem
that different parts of your program might disagree about whether 'long
long' works?  Surely the solution is to get on the same page with
respect to whether 'long long' works, as this would in general be needed
in any event.

> If your change doesn't introduce a bug in vasnprintf, as far as I can
> tell after looking at the code for 10 minutes, it's because of the way
> the code is written, and by luck.

I hope our luck holds.  But how serious is the issue?  What platforms
have a 32-bit 'long long'?  Traditionally, 'long long' has always been
at least 64 bits.  That was the point of 'long long', and by now many
applications must assume it, so I suspect that in practice we will solve
more problems than we cause by requiring 'long long' to be 64 bits.

For what it's worth, the gnulib macro was originally intended to check
for 64-bittedness as well (hence the 63-bit shift).  I think that check
was inadvertently dropped when it was changed from a run-time to a
link-time test.