Re: Compiling and Linking pre-built Windows Python libraries with C++ files on Linux for Windows

2022-03-19 Thread Dan Stromberg
On Fri, Mar 18, 2022 at 8:03 PM Ankit Agarwal  wrote:

> Hi,
>
> This is a very specific question. I am trying to figure out whether or not
> I can use pre-built python libraries and headers on Windows in a MinGW
> build on Linux. Essentially I have some python and C++ code which interface
> via cython and pybind. I want to build a self contained C++ binary for
> Windows with the MinGW compiler that runs on Linux. For both Cython and
> PyBind, they need to compile with the python headers and link against the
> python DLL in order to run on Windows.
>
> I know that the python DLL specifically are compiled with the MSVC
> compiler, however since it is in C, the ABI between the DLL should be
> compatible with MinGW, and I should be able to import and link against it.
> My question is will this work, or will there be some other problem that I
> might run into.
>
>
I haven't tried this.

However, I used to cross-compile the Linux kernel from Solaris on Sparc to
Intel.  I just had to:
1) Get the relevant headers and libraries on Solaris
2) Deal with the byte-order issues - Sparc is big-endian, Intel is little
endian
3) Natively compile a little bit of code that was needed by the build
process

You appear to be aware of #1.

You probably won't need to worry about #2, since you're going Intel ->
Intel.

#3 could be an issue for you, but it's just a matter of using two different
compilers for some different parts of the build process - one native, one
cross.

I'd try a little hello world first, then worry about your larger project.
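A hello-world smoke test along those lines might look like this (a sketch:
it assumes the mingw-w64 cross toolchain package is installed, which
provides the x86_64-w64-mingw32-gcc compiler on most Linux distros, and
falls back to a message when it isn't):

```shell
# Write a minimal C program to cross-compile for Windows.
cat > hello.c <<'EOF'
#include <stdio.h>
int main(void) { printf("hello from a cross-built exe\n"); return 0; }
EOF

if command -v x86_64-w64-mingw32-gcc >/dev/null 2>&1; then
    # Cross-compile on Linux, producing a Windows PE executable.
    x86_64-w64-mingw32-gcc -o hello.exe hello.c
    # If Wine is available, run the result as a quick smoke test.
    command -v wine >/dev/null 2>&1 && wine ./hello.exe
else
    echo "x86_64-w64-mingw32-gcc not found; install the mingw-w64 package"
fi
```

If that works end to end, the same compiler invocation (plus `-I` for the
Python headers and `-l` for the import library) is the starting point for
the real build.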

You could also put out feelers about Cython and pybind, to see whether
they've been used for cross-compilation before.  If so, you're probably in
like Flynn; otherwise it could turn out to be a big project.

If cross-compilation doesn't work out, you could probably set up a Windows
virtual machine with an sshd, and build on that.

Either way, you may find Wine useful for testing.

HTH.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Compiling and Linking pre-built Windows Python libraries with C++ files on Linux for Windows

2022-03-19 Thread Eryk Sun
On 3/18/22, Ankit Agarwal  wrote:
> Hi,
>
> This is a very specific question. I am trying to figure out whether or not
> I can use pre-built python libraries and headers on Windows in a MinGW
> build on Linux. Essentially I have some python and C++ code which interface
> via cython and pybind. I want to build a self contained C++ binary for
> Windows with the MinGW compiler that runs on Linux. For both Cython and
> PyBind, they need to compile with the python headers and link against the
> python DLL in order to run on Windows.
>
> I know that the python DLL specifically are compiled with the MSVC
> compiler, however since it is in C, the ABI between the DLL should be
> compatible with MinGW, and I should be able to import and link against it.
> My question is will this work, or will there be some other problem that I
> might run into.

MinGW used to link with msvcrt (the private CRT for system components)
instead of ucrt (the universal CRT). If it still does that, then you
won't be able to share some of the POSIX compatibility features
between the two CRTs, such as file descriptors and the locale. Their
FILE stream records are also incompatible. Also, you'll have to be
certain to never free() memory that was allocated by a different CRT.
msvcrt uses a private heap, and ucrt uses the main process heap. For
example (with a debugger attached):

>>> import ctypes
>>> ucrt = ctypes.CDLL('ucrtbase', use_errno=True)
>>> msvcrt = ctypes.CDLL('msvcrt', use_errno=True)
>>> ucrt.malloc.restype = ctypes.c_void_p
>>> msvcrt.free.argtypes = (ctypes.c_void_p,)

>>> b = ucrt.malloc(4096)
>>> msvcrt.free(b)

HEAP[python.exe]: Invalid address specified to
RtlFreeHeap( 024389C1, 024389A58FB0 )
(1b44.1ca0): Break instruction exception - code 80000003 (first chance)
ntdll!RtlpBreakPointHeap+0x16:
00007ff8`8baa511e cc  int 3
0:000>
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Reducing "yield from" overhead in recursive generators

2022-03-19 Thread Barry


> On 19 Mar 2022, at 03:07, Greg Ewing  wrote:
> 
> On 19/03/22 9:40 am, Rathmann wrote:
>> The other challenge/question would be to see if there is a way to implement
>> this basic approach with lower overhead.
> 
> My original implementation of yield-from didn't have this problem,
> because it didn't enter any of the intermediate frames -- it just
> ran down a chain of C pointers to find the currently active one.
> 
> At some point this was changed, I think so that all the frames
> would show up in the traceback in the event of an exception.
> This was evidently seen as more important than having efficient
> resumption of nested generators.

I would have thought it would be possible to do both sets of
pointer/accounting: one that supports your original fast implementation,
plus the extra information needed to produce the traceback.

Barry

> 
> I'm not sure exactly how it works now, but I believe it involves
> re-executing the YIELD_FROM bytecode in each generator down the
> chain whenever a resumption occurs. This is nice and simple, but
> not very efficient.
> 
> Maybe another way could be found of preserving the traceback,
> such as temporarily splicing the chain of frames into the call
> stack and then resuming the last one.
> 
> -- 
> Greg
> -- 
> https://mail.python.org/mailman/listinfo/python-list
> 

-- 
https://mail.python.org/mailman/listinfo/python-list