Antoine Pitrou <pit...@free.fr> added the comment:

The problem with a signed Py_UNICODE is implicit sign extension (rather than
zero extension) in some conversions, for example from a plain "char" (which may
itself be signed) to "Py_UNICODE". The effects could range from incorrect
results to outright crashes, not only in our own code but also in C extensions
that rely on the unsignedness of Py_UNICODE.
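
To make the hazard concrete, here is a minimal sketch, assuming a platform
where plain "char" is signed and Py_UNICODE is a signed 32-bit type (as
wchar_t is on Linux); the variable names are purely illustrative:

    #include <stdio.h>
    #include <wchar.h>

    int main(void)
    {
        char c = (char)0xE9;        /* U+00E9 (e-acute) in Latin-1; negative if char is signed */
        wchar_t as_signed = c;      /* sign extension: -23 instead of code point 0xE9 */
        unsigned int as_unsigned = (unsigned char)c;  /* zero extension: stays 0xE9 == 233 */
        printf("%ld vs %u\n", (long)as_signed, as_unsigned);
        return 0;
    }

A negative value like that can then be compared against code point ranges or
used as a table index, which is where the incorrect results and crashes come
from.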

Is there a way to enable those optimizations while keeping an unsigned
Py_UNICODE type? It seems Py_UNICODE doesn't have to be typedef'ed to wchar_t;
it could be defined as an unsigned integer of the same width. Or would that
break some part of the C standard?
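
For what it's worth, a rough sketch of that idea, picking an unsigned integer
type of the same width as wchar_t; the real definition in unicodeobject.h is
driven by configure-time checks, so the WCHAR_MAX test here is only an
illustration:

    #include <stdint.h>   /* uint16_t, uint32_t, WCHAR_MAX */

    /* Pick an unsigned integer type with the same width as wchar_t
       instead of typedef'ing Py_UNICODE to wchar_t directly. */
    #if WCHAR_MAX <= 0xFFFF
    typedef uint16_t Py_UNICODE;
    #else
    typedef uint32_t Py_UNICODE;
    #endif

One catch is that a Py_UNICODE* buffer would then no longer be a wchar_t* as
far as the compiler is concerned, so passing it straight to wchar_t APIs would
need casts and is arguably an aliasing violation, which may be exactly the
part of the C standard in question.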

----------
nosy: +lemburg, loewis, pitrou

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue8781>
_______________________________________