[Python-Dev] Re: Version 2 of PEP 670 – Convert macros to functions in the Python C API
On 23. 02. 22 20:15, Victor Stinner wrote:
On Wed, Feb 23, 2022 at 7:11 PM Petr Viktorin wrote:

I did realize there is one more issue when converting macros or static inline functions to regular functions. Regular functions' bodies aren't guarded by limited API #ifdefs, so if they are part of the limited API it is easy to forget to think about that when changing them. If a macro in the limited API is converted to a regular function, a test should be added to ensure that the old implementation of the macro (i.e. what is compiled into stable ABI extensions) still works.

Does this problem really belong to PEP 670 "Convert macros to functions in the Python C API", or is it more something for PEP 652 "Maintaining the Stable ABI"? PEP 652 is a historical document for Python 3.10. Maybe this should go in the devguide, in a section on how to convert macros/static inline functions to regular functions?

I don't think that Python 3.11 should keep a copy of the Python 3.10 macros: it would increase the maintenance burden, since each function would have two implementations (the 3.11 function and the 3.10 macro). Also, there would be no guarantee that the copied 3.10 macros stay exactly the same as the 3.10 code if someone changes them by mistake, directly or indirectly (by changing code used by a macro, changing a compiler flag, etc.).

Maybe such a stable ABI test belongs in an external project that builds a C extension with the Python 3.10 limited C API (or an older version) and then tests it on Python 3.11. IMO that is the reliable way to test the stable ABI: a functional test.

Maybe. But until we have that kind of test infrastructure, I'm worried that converting limited API macros to regular functions will make it harder to keep the stable ABI working.
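To make the "functional test" idea concrete, here is a minimal sketch of such an external test extension. It assumes only that the module is built with Py_LIMITED_API set to the 3.10 ABI and later imported on Python 3.11; the module and function names are made up for illustration.

    /* abitest.c: build against the Python 3.10 limited C API, then import on 3.11. */
    #define Py_LIMITED_API 0x030A0000
    #include <Python.h>

    static PyObject *
    check_abi(PyObject *self, PyObject *obj)
    {
        /* Only limited-API calls: whatever the 3.10 headers expanded or
           exported for these is what a 3.11 runtime has to keep working. */
        Py_hash_t h = PyObject_Hash(obj);
        if (h == -1) {
            return NULL;  /* exception already set */
        }
        return PyLong_FromSsize_t(h);
    }

    static PyMethodDef abitest_methods[] = {
        {"check_abi", check_abi, METH_O, "Exercise a limited-API call."},
        {NULL, NULL, 0, NULL}
    };

    static struct PyModuleDef abitest_module = {
        PyModuleDef_HEAD_INIT, "abitest", NULL, -1, abitest_methods,
        NULL, NULL, NULL, NULL
    };

    PyMODINIT_FUNC
    PyInit_abitest(void)
    {
        return PyModule_Create(&abitest_module);
    }

A wheel built from this against 3.10 keeps the 3.10 macro expansions in its object code, so importing it on 3.11 exercises exactly the compatibility the test is meant to cover.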
[Python-Dev] Re: Require a C compiler supporting C99 to build Python 3.11
Ok, let me try something simpler:

"Python 3.11 and newer versions use C11 without optional features. The public C API should be compatible with C++."
https://github.com/python/peps/pull/2309/files

Victor
[Python-Dev] Embedding multiple Python runtimes in the same process on Windows
Hi all,

This is specifically about embedding Python on Windows, and I'm hoping some of the Windows Python devs might have some ideas or be interested in this. I have implemented a partial solution (linked below) and I'm interested to hear what other people think of it.

Currently, when embedding Python, both python3x.dll and python3.dll get loaded, with python3.dll redirecting stable API calls to python3x.dll. If a process embeds two different versions of Python 3, e.g. python39.dll and python310.dll, they both load their respective python3.dlls. At this point I imagine you are thinking "don't do that", but bear with me...

The problem comes when the Python loaded second imports an extension module linked against python3.dll. The python3.dll that it gets linked to will be the first one that was loaded, not the one belonging to the second Python that is actually doing the import. This results in a call into the wrong Python DLL, and 'bad things' happen. None of this is unexpected, and I'm sure the sensible thing to do is simply not do this... but I've been working on a way to make it work anyway.

In my case I have a plugin to another application that embeds Python into that application. It's perfectly possible (and reasonable) for other plugins to also want to embed Python. In most cases this can be dealt with easily enough by having both plugins use the same Python environment. With Python 2 it used to be possible to have two different Python interpreters embedded at the same time without them interfering with each other (although it is possible there would still be issues with DLL versions used by extension modules). With Python 3, if we want two different versions of Python 3 embedded at the same time, it will fail for the reason outlined above.

My idea is to redirect all loaded python3.dll DLLs to the one we want to be used before loading any extension modules (i.e. just before any call to LoadLibraryEx) and then restore them afterwards. This can be done by manipulating the loader's module list in Windows. I have implemented this as a proof of concept in my own plugin and confirmed that it works and does allow two different versions of Python 3 to be embedded at the same time. It works with an unmodified version of Python by applying the redirect in an import hook. It uses several undocumented Windows structures and APIs in order to safely manipulate the loader table. Here is my proof-of-concept code that performs the redirect: https://gist.github.com/tonyroberts/74888762f0063238d4f7fd7c7d36f0f0

While this works for different versions of Python 3, there is still a problem when trying to embed two different instances of the *same* version of Python 3. The problem is basically the same, but with the added complication that the pyd files are named the same, so the ones loaded first get found by the second Python runtime and you again end up calling across runtimes. I managed to solve this using a similar method to the code above, but rather than redirecting just python3.dll, I look for any other loaded python3x.dll and then remove *all* modules loaded under that Python distribution from the loader table. This ensures that the two Python runtimes are effectively isolated from each other, as neither sees any of the same modules. This gets more complicated once you start thinking about user site-packages folders and venvs, but for simple distributions where everything is under the same root folder this technique works.
Anyway, just keen to hear what people think or whether this has been tackled before in another way.

Best regards,
Tony
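As a minimal illustration of the starting point for this kind of trick (using only documented Win32/PSAPI calls; the loader-table redirection in the gist above relies on undocumented structures and is deliberately not reproduced here), a plugin can at least detect which python3*.dll modules another embedder has already loaded into the process:

    #include <windows.h>
    #include <psapi.h>
    #include <stdio.h>
    #include <string.h>

    /* Print every python3*.dll already loaded in the current process.
       An embedding plugin could use this to detect that another runtime
       (and its python3.dll forwarder) got there first. */
    static void list_loaded_python_dlls(void)
    {
        HMODULE modules[1024];
        DWORD needed = 0;

        if (!EnumProcessModules(GetCurrentProcess(), modules,
                                sizeof(modules), &needed)) {
            return;
        }
        for (DWORD i = 0; i < needed / sizeof(HMODULE); i++) {
            char path[MAX_PATH];
            if (GetModuleFileNameA(modules[i], path, MAX_PATH) == 0) {
                continue;
            }
            const char *base = strrchr(path, '\\');
            base = base ? base + 1 : path;
            if (_strnicmp(base, "python3", 7) == 0) {
                printf("loaded: %s (%s)\n", base, path);
            }
        }
    }

Depending on the SDK settings, EnumProcessModules may need Psapi.lib at link time.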
[Python-Dev] Re: Require a C compiler supporting C99 to build Python 3.11
> On 24 Feb 2022, at 11:45, Victor Stinner wrote:
>
> Ok, let me try something simpler:
>
> "Python 3.11 and newer versions use C11 without optional features. The
> public C API should be compatible with C++."
> https://github.com/python/peps/pull/2309/files

"Should" is often read as meaning optional when writing specs. Can you say “must be compatible with C++”?

Barry
[Python-Dev] Re: Require a C compiler supporting C99 to build Python 3.11
On Thu, Feb 24, 2022 at 11:10 PM Barry wrote:
> > "Python 3.11 and newer versions use C11 without optional features. The
> > public C API should be compatible with C++."
> > https://github.com/python/peps/pull/2309/files
>
> Should is often read as meaning optional when writing specs.
> Can you say “must be compatible with C++”.

I plan to attempt to write an actual test for that, rather than a vague sentence in a PEP. For now, "should" is a deliberate choice: I don't know exactly which C++ version should be targeted, or whether it is really an issue at all.

For example, C++20 reserves the "module" keyword, whereas Python uses it in its C API. Example:

PyAPI_FUNC(int) PyModule_AddType(PyObject *module, PyTypeObject *type);

See:

* https://bugs.python.org/issue39355
* https://github.com/pythoncapi/pythoncapi_compat/issues/21

--

I made a change in the datatable project to add Python 3.11 support using the pythoncapi_compat.h header file. Problem: this *C* header file produced new warnings in the datatable extension module, which is built with a C++ compiler:
https://github.com/h2oai/datatable/pull/3231#issuecomment-1032864790

Examples:

| src/core/lib/pythoncapi_compat.h:272:52: warning: zero as null pointer constant [-Wzero-as-null-pointer-constant]
|                     tstate->c_profilefunc != NULL);
|                                              ^~~~
|                                              nullptr

and

| src/core/lib/pythoncapi_compat.h:170:12: warning: use of old-style cast [-Wold-style-cast]
|     return (PyCodeObject *)_Py_StealRef(PyFrame_GetCode(frame));
|            ^

I made pythoncapi_compat.h compatible with C++ (fixing the C++ compiler warnings) by using nullptr and a reinterpret_cast<TYPE>(EXPR) cast when the __cplusplus macro is defined, or NULL and a ((TYPE)(EXPR)) cast otherwise.

datatable also uses #include "Python.h". I don't know why there were only C++ compiler warnings for "pythoncapi_compat.h". Maybe it is because datatable only uses static inline functions from "pythoncapi_compat.h", but it may emit the same warnings tomorrow if some static inline functions from "Python.h" are used.

For now, I prefer to put a reminder in PEP 7 that the "Python.h" C API is consumed by C++ projects.

Victor
--
Night gathers, and now my watch begins. It shall not end until my death.
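A minimal sketch of that pattern (the macro names below are made up for illustration; pythoncapi_compat uses its own internal names):

    /* COMPAT_NULL and COMPAT_CAST are illustrative names, not the real ones. */
    #ifdef __cplusplus
    #  define COMPAT_NULL nullptr
    #  define COMPAT_CAST(type, expr) reinterpret_cast<type>(expr)
    #else
    #  define COMPAT_NULL NULL
    #  define COMPAT_CAST(type, expr) ((type)(expr))
    #endif

    /* With that, the two lines quoted in the warnings become, for example:
         return COMPAT_CAST(PyCodeObject *, _Py_StealRef(PyFrame_GetCode(frame)));
         ... tstate->c_profilefunc != COMPAT_NULL ...
       and compile cleanly as both C and C++. */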
[Python-Dev] Re: Should we require IEEE 754 floating-point for CPython?
The consensus is to require IEEE 754 to build CPython, but not to require it in the Python language specification.

Updates (changes merged in bpo-46656):

* Building Python 3.11 now requires a C11 compiler; optional C11 features are not required. I documented this in What's New in Python 3.11 and in PEP 7.

* Building Python 3.11 now requires support for floating-point Not-a-Number (NaN): the Py_NO_NAN macro has been removed.

Victor
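For context, a compile-time guard against non-IEEE-754 doubles can look roughly like the sketch below; this is only an illustration, not the actual configure check CPython uses:

    #include <float.h>

    /* IEEE 754 binary64: radix-2 'double' with a 53-bit significand and
       a maximum binary exponent of 1024. */
    #if FLT_RADIX != 2 || DBL_MANT_DIG != 53 || DBL_MAX_EXP != 1024
    #  error "An IEEE 754 'double' is required to build this code"
    #endif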
[Python-Dev] Re: Require a C compiler supporting C99 to build Python 3.11
On Thu, Feb 24, 2022 at 3:27 PM Victor Stinner wrote:
> On Thu, Feb 24, 2022 at 11:10 PM Barry wrote:
> > > "Python 3.11 and newer versions use C11 without optional features. The
> > > public C API should be compatible with C++."
> > > https://github.com/python/peps/pull/2309/files
> >
> > Should is often read as meaning optional when writing specs.
> > Can you say “must be compatible with C++”.
>
> I plan to attempt to write an actual test for that, rather than a
> vague sentence in a PEP. For now, "should" is a deliberate choice: I
> don't know exactly which C++ version should be targeted and if it's
> really an issue or not.

Agreed. "Should" is good because we're not even clear whether we currently comply with the C++ standards. For instance, https://bugs.python.org/issue40120 suggests we technically may not for C++ (it is not strictly a superset of C, as we all like to pretend), though for practical purposes compilers tend to allow that regardless of the standards.

We're likely overspecifying in any document we create about what we require, because the only definition any of us are actually capable of giving for what we require is: "Does it compile with this compiler on this platform? If yes, then we appear to support it. Can we guarantee that? Only with buildbots or other CI." We're generally not versed in specific language standards (aside from compiler folks, who is?), and compilers don't comply strictly with all the shapes of those anyway, for either practical or hysterical reasons. So no matter what we claim to aspire to, reality is always murkier.

A document about requirements is primarily useful to give guidance on what we expect to be aligned with and what is or isn't allowed in new code. Our code itself always has the final say.

-gps
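For anyone who has not run into it, a small illustration of the "not strictly a superset" point (generic examples, not the specific construct discussed in issue 40120):

    #include <stdlib.h>

    int main(void)
    {
        /* Valid C: void * converts implicitly to any object pointer type.
           A C++ compiler rejects this without an explicit cast. */
        int *p = malloc(10 * sizeof *p);

        /* Valid C (as long as it is never written through): a string literal
           initialising a non-const char *.  Ill-formed in C++11 and later. */
        char *s = "hello";

        free(p);
        return s != NULL ? 0 : 1;
    }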
