Scary configure output noticed:

    checking for int64_t... int
    checking for uint64_t... unsigned int
config.h:

    /* Define to the type of a signed integer type of width exactly 64 bits
       if such a type exists and the standard includes do not define it. */
    #define int64_t int

Looks like a known/fixed regression in autoconf 2.64.

    bash-3.2$ gcc -v
    Using built-in specs.
    Target: alpha-dec-osf4.0g
    Configured with: /usr/users/m3build/src/gcc-4.2.4/configure -prefix=/usr/users/m3build -disable-bootstrap
    Thread model: posix
    gcc version 4.2.4

http://www.mail-archive.com/bug-autoconf@gnu.org/msg02512.html

So the question is: what is wrong with the following seemingly simple algorithms?

1) Find which of the types char, short, int, long, long long exist, take their sizes, #include <limits.h>, and multiply by CHAR_BIT to get the widths. Or probably just assume everything but long long exists. Or error out if anything other than long long (or __int64 in its place) is missing.

2) #include <limits.h> and compare UNSIGNED_FOO_MAX for equality with 255, 65535, 2^32 - 1, etc. Before attempting to compare for equality with the maximum 64-bit number, since the preprocessor might reject such a large constant, you would first see if long long exists; if so, ok. Otherwise see if unsigned long's maximum exceeds 2^32 - 1, and only then compare it for equality with the 64-bit maximum. You also have to hunt around slightly for the name of "ULONGLONG_MAX". Granted.

Don't want to depend on limits.h? Honestly, what I often do is just:

    typedef signed char INT8;
    typedef short INT16;
    typedef int INT32;
    #if defined(_MSC_VER) || defined(__DECC)
    typedef __int64 INT64;
    #else
    typedef long long INT64;
    #endif

But I understand the problems with that. None of those typedefs are *guaranteed* to be correct.

Thanks,
 - Jay