Daniel Stutzbach <dan...@stutzbachenterprises.com> added the comment:

Forcing the compile-time and link-time Unicode settings to match is tricky.  The 
goals are:

- Allow Unicode-agnostic modules to link, regardless of the Unicode settings
- Cause a link failure for a module that pokes into PyUnicodeObject if the 
Unicode settings are mismatched

All of the solutions I've come up with have trade-offs.  Here is one approach:

Expose PyUnicodeObject if and only if the extension #defines a special flag 
(PY_EXPOSE_UNICODE_OBJECT?) before including Python.h.

If an extension does NOT define the flag, then neither PyUnicodeObject nor the 
accessor macros for its fields will be defined.  All of the opaque and 
non-opaque functions are still declared; the module just can't poke directly 
into PyUnicodeObject.  Linking will succeed as long as the module avoids the 
non-opaque functions.
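
Roughly, the exposed part of unicodeobject.h would look something like this 
(just a sketch: PY_EXPOSE_UNICODE_OBJECT is a placeholder name, and the struct 
layout shown is the current one):

/* unicodeobject.h (sketch) */

#ifdef PY_EXPOSE_UNICODE_OBJECT

/* The struct and its field-accessor macros are only visible to
   modules that explicitly asked for them. */
typedef struct {
    PyObject_HEAD
    Py_ssize_t length;          /* length of str, in code units */
    Py_UNICODE *str;            /* UCS-2 or UCS-4, depending on the build */
    long hash;                  /* cached hash, or -1 */
    PyObject *defenc;           /* cached encoded version, or NULL */
} PyUnicodeObject;

#define PyUnicode_AS_UNICODE(op)   (((PyUnicodeObject *)(op))->str)
#define PyUnicode_GET_SIZE(op)     (((PyUnicodeObject *)(op))->length)

#endif /* PY_EXPOSE_UNICODE_OBJECT */

/* The opaque API is declared unconditionally, so Unicode-agnostic
   modules compile and link unchanged. */
PyAPI_FUNC(Py_ssize_t) PyUnicode_GetSize(PyObject *unicode);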

If the flag IS defined, then PyUnicodeObject will be defined along with the 
accessor macros.  The unicodeobject.h header will also arrange to require a 
symbol whose name depends on the Unicode settings, so that linking succeeds 
only if the module was compiled with the same Unicode settings as Python.
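
One way to require such a symbol (again only a sketch; _PyUnicode_ABICheck is a 
made-up helper) is to route the exposed accessor macros through a function whose 
name encodes the build's Py_UNICODE width, so that a module compiled against 
mismatched headers references a symbol the interpreter never exports:

/* Also guarded by PY_EXPOSE_UNICODE_OBJECT in unicodeobject.h. */
#ifdef Py_UNICODE_WIDE
#  define _PyUnicode_ABICheck _PyUnicode_ABICheck_UCS4
#else
#  define _PyUnicode_ABICheck _PyUnicode_ABICheck_UCS2
#endif

/* At runtime this would simply return its argument; it exists so the
   width-specific name must be resolved when the module is loaded. */
PyAPI_FUNC(PyObject *) _PyUnicode_ABICheck(PyObject *);

/* Accessors go through the check, so a module built with the wrong
   Unicode setting fails to link instead of silently misreading memory. */
#define PyUnicode_AS_UNICODE(op) \
    (((PyUnicodeObject *)_PyUnicode_ABICheck((PyObject *)(op)))->str)

The extra call is the obvious cost of doing it that way; the reference could 
presumably also be forced through the module-initialization machinery instead, 
but the function call keeps the sketch simple and portable.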

The upside of this approach is that it works: Unicode-agnostic modules will 
always link, and modules that poke into PyUnicodeObject will fail to link when 
the Unicode settings are mismatched.

The drawback is that a few extension modules will need to add a #define before 
including Python.h (see the example below).  Only modules that poke into 
PyUnicodeObject will be affected.  No change will be required for modules that 
depend on the Unicode setting yet stick to functions (opaque or not).
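
For an affected module, the change would then be a one-liner (using the 
placeholder flag name from above):

/* In the extension's source, before any Python header: */
#define PY_EXPOSE_UNICODE_OBJECT
#include <Python.h>

/* Pokes directly at the Py_UNICODE buffer; without the #define above,
   PyUnicode_AS_UNICODE would not be available under this proposal, and
   with it the module will only link against a matching build. */
static Py_UNICODE
first_code_unit(PyObject *obj)
{
    return PyUnicode_AS_UNICODE(obj)[0];
}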

Thoughts?

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue8654>
_______________________________________