STINNER Victor added the comment:

Some Google searches suggest that no current CPU supports 128-bit
integers natively, and that using a 64-bit integer as intmax_t is safe.

GCC has an __int128 type, but I could not find out on which platforms it
is supported, nor whether intmax_t is __int128 in that case.

Microsoft Visual Studio has shipped stdint.h since version 2010 (the
version required to build Python 3.4 on Windows, according to the Python
Developer's Guide).

I propose a safer definition of Py_intmax_t:

/* Prefer the C99 intmax_t/uintmax_t types when the platform provides them. */
#ifdef HAVE_UINTMAX_T
typedef uintmax_t Py_uintmax_t;
typedef intmax_t Py_intmax_t;

/* Otherwise fall back to size_t/Py_ssize_t if they are at least 64-bit. */
#elif SIZEOF_SIZE_T >= 8
typedef size_t Py_uintmax_t;
typedef Py_ssize_t Py_intmax_t;

/* Last resort: a long long of at least 64 bits. */
#elif defined(HAVE_LONG_LONG) && SIZEOF_LONG_LONG >= 8
typedef unsigned PY_LONG_LONG Py_uintmax_t;
typedef PY_LONG_LONG Py_intmax_t;

#else
#   error "Python needs a typedef for Py_uintmax_t in pyport.h."
#endif

I don't think a fallback to the long type is required; testing size_t
should be enough.

At worst, compilation fails explicitly if the Py_intmax_t type cannot be
defined.

Having generic PyLong_FromUintMax_t() and PyLong_AsUintMax_t() functions
(and signed versions) would simplify supporting other OS types of
variable size: time_t, clock_t, pid_t, gid_t, uid_t, off_t, etc.

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue17870>
_______________________________________