Ronny Rentner <pyt...@ronny-rentner.de> added the comment:
Thanks for your quick response.

My bigger scope is real-time audio and video processing, where I use multiple processes that need to be synchronized, and I use shared memory for that. As a small spin-off, I've hacked together a dict that uses shared memory as its storage.

It works like this: it uses one shared memory block for streaming updates. This is efficient because only changes are transferred. Once the streaming buffer is full, or if any single update to the dict is larger than the streaming buffer, it creates a full dump of the whole dict in a new shared memory block that is just as big as needed. Any user of the dict then consumes the full dump. (A minimal sketch of this scheme is at the end of this message.)

On Linux that works great: any user of the dict can create a full dump in a new shared memory block, and all other users of the same dict can consume it. On Windows, the issue is that when the creator process of the full dump goes away, the shared memory goes away with it. Unfortunately, that contradicts the Python docs.

I don't fully understand the underlying implementations, but I've been looking at https://docs.microsoft.com/en-us/dotnet/standard/io/memory-mapped-files and I understand there are two main modes. The persisted mode sounds just like how Python shared memory also works on Linux (where the /dev/shm/* files can outlive the Python process), but I think on Windows, Python is not using the persisted mode, so the shared memory goes away, in contrast to how it works on Linux. (A small repro sketch is at the end of this message as well.)

PS: You can find the code for this shared dict here: https://github.com/ronny-rentner/UltraDict - Please note, it's an early hack and not well tested.
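For reference, here is a minimal sketch of the streaming/full-dump scheme described above. This is not the actual UltraDict code: the class name, the 4096-byte buffer size and the pickle-based encoding are made up for illustration, and the coordination by which readers discover the block names and replay the stream is left out.

import pickle
from multiprocessing import shared_memory

STREAM_SIZE = 4096  # illustrative capacity of the streaming buffer

class SharedDictSketch:
    def __init__(self):
        self.data = {}
        self.dump = None  # full-dump block, created lazily
        # Fixed-size block that carries incremental (key, value) updates.
        self.stream = shared_memory.SharedMemory(create=True, size=STREAM_SIZE)
        self.offset = 0

    def set(self, key, value):
        self.data[key] = value
        update = pickle.dumps((key, value))
        if self.offset + len(update) > STREAM_SIZE:
            # The stream is full, or this single update alone does not fit:
            # fall back to a full dump. The current update is already in
            # self.data, so the dump covers it.
            self._full_dump()
        else:
            self.stream.buf[self.offset:self.offset + len(update)] = update
            self.offset += len(update)

    def _full_dump(self):
        # Serialize the whole dict into a new block that is just as big
        # as needed, then reset the streaming buffer.
        blob = pickle.dumps(self.data)
        if self.dump is not None:
            self.dump.close()
            self.dump.unlink()
        self.dump = shared_memory.SharedMemory(create=True, size=len(blob))
        self.dump.buf[:len(blob)] = blob
        self.offset = 0  # the streaming buffer starts over after a dump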
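And here is a small repro sketch for the Windows lifetime problem (the block name is again made up). On Linux this prints the byte written by the child, because the /dev/shm file keeps the segment alive after the creator exits; on Windows, as far as I understand it, attaching fails with FileNotFoundError because the OS destroys the mapping once the creator's last handle is closed.

import multiprocessing
from multiprocessing import shared_memory

NAME = "lifetime_demo"  # illustrative block name

def creator():
    # Create a named block and exit WITHOUT unlinking it.
    shm = shared_memory.SharedMemory(name=NAME, create=True, size=64)
    shm.buf[0] = 42
    shm.close()

if __name__ == "__main__":
    p = multiprocessing.Process(target=creator)
    p.start()
    p.join()
    try:
        shm = shared_memory.SharedMemory(name=NAME)  # attach by name
        print("still there, first byte =", shm.buf[0])  # seen on Linux
        shm.close()
        shm.unlink()
    except FileNotFoundError:
        print("gone together with the creator process")  # seen on Windows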