On Wed, Jul 1, 2020 at 9:37 AM David Lowry-Duda <da...@lowryduda.com> wrote:
>
> I have quite a bit of experience with various programming languages,
> including python, C, C++, and lisp. But I very rarely work on projects
> that use interoperability. This is something that I'd like to learn more
> about.
>
> Sometimes, for numerical computing, I use cython as a binding layer
> between C++ and python, but that's the extent of my experience. I'm
> under the impression that if one were to want to use something like lisp
> (like say ecl embeddable common lisp), C, and python together, the
> process boils down to writing a lot of wrappers in C. And more
> generally, C is a frequent lowest-common-denominator language. Is my
> understanding right?
>
On most operating systems, there's some kind of binary executable format that is compatible with basically any language but is most often described in C, so yes, C becomes the de facto language for this kind of thing.

But you have another very good option: keep all the binaries entirely separate and use interprocess communication between them. For example, you could have a Python program that starts a Lisp interpreter as a subprocess and communicates with it via stdin/stdout; or a TCP or Unix socket can be used to negotiate between groups of languages; or everything can push its data into a PostgreSQL database. There are many options, depending on what kind of interaction you need.

I'm a big fan of using networking to communicate (i.e. TCP sockets, or higher-level constructs such as HTTP or WebSockets), as these can be used by all manner of programs, even those running in restricted environments such as inside a web browser. For instance, I have a Pike program that manages my Twitch channel bot. It can signal a web page (JavaScript running in a web browser) over a WebSocket to have it do things, and a Python program can talk to it via the same sort of WebSocket to send it a message. Meanwhile, a shell script can trigger things by invoking Python, and so on. It's easy to swap out different parts, because the communication lines are defined in simple terms like "HTTP with JSON payload", which every language can understand.

If you actually need real function calls (as you describe with Cython sitting between C++ and Python), the hardest part is managing the different data types. Python code running in CPython already has a rich C API for this, but once you bring some other language into the mix, you have to think about the differences between different integer types, or different string types.
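The subprocess approach might look like this minimal sketch. A tiny Python one-liner stands in for the Lisp interpreter here, since the actual child program is up to you; the parent writes a line to the child's stdin and reads the reply from its stdout:

```python
import subprocess
import sys

# Hypothetical child process: a small Python script standing in for
# a Lisp interpreter. It upper-cases each line it receives on stdin.
child_code = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    print(line.strip().upper(), flush=True)\n"
)

proc = subprocess.Popen(
    [sys.executable, "-c", child_code],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# Send one request over stdin, then read the child's full reply.
out, _ = proc.communicate("hello from the parent\n")
result = out.strip()
```

The only contract between parent and child is "lines of text in, lines of text out", so the child could be rewritten in any language without touching the parent.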
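A socket-based version could be sketched like this, with newline-delimited JSON as the wire format (one possible choice, not a fixed standard); the "service" runs in a thread here purely so the example is self-contained, but it could just as well be a separate program in another language:

```python
import json
import socket
import threading

def serve(server_sock):
    # Accept one connection, read one JSON line, reply with a JSON line.
    conn, _ = server_sock.accept()
    with conn:
        request = json.loads(conn.makefile("r").readline())
        reply = {"echo": request["msg"], "lang": "python"}
        conn.sendall((json.dumps(reply) + "\n").encode())

server = socket.socket()
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve, args=(server,), daemon=True).start()

# The "client" side: connect, send a JSON request, read the JSON reply.
client = socket.create_connection(("127.0.0.1", port))
client.sendall((json.dumps({"msg": "ping"}) + "\n").encode())
reply = json.loads(client.makefile("r").readline())
client.close()
```

Because the protocol is just "JSON over a socket", the client here could be curl, a browser, a Lisp image, or a shell script, and the server would never know the difference.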
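To see the data-type mismatch concretely, ctypes makes C's fixed-width integer types visible from Python (a small illustration of the general problem, not tied to any particular foreign library):

```python
import ctypes

# Python ints are arbitrary-precision; C integer types are fixed-width.
# Storing 300 in a C int8 silently wraps around (300 mod 256 = 44),
# while an int32 holds it comfortably.
narrow = ctypes.c_int8(300)
wide = ctypes.c_int32(300)
wrapped = narrow.value
kept = wide.value

# Strings have the same issue: C expects bytes, not Python's str.
buf = ctypes.create_string_buffer(b"hello")
```

Every cross-language function call has to make decisions like these for each parameter and return value, which is exactly where the wrapper-writing effort goes.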
Unless your code is very carefully crafted, you will probably find that it's just as much trouble - and just as much of a performance hit - as it would be if you were using external communication. It takes a lot of planning to properly take advantage of polyglot intra-process communication :)

ChrisA
--
https://mail.python.org/mailman/listinfo/python-list