I'm setting up a system that consists of several small Python applications that all communicate with each other on the same PC.

When running on Windows, launching each application generates a separate process, and each of those processes ends up taking > 4 MB of system memory, as reported by the Windows Task Manager for the python.exe image name.

My question: is there any way to reduce this per-process overhead? For example, can you set things up so that one python.exe instance handles multiple processes?

One possibility I considered is to run the applications as threads of a single process rather than as multiple processes, but this has other drawbacks for my application and I'd rather not. Another possibility I considered is to strip out all but the most essential imports in each app, but I tested this and the benefit is marginal: a simple one-liner app consisting of 'x = raw_input()' still eats up > 2.7 MB. I also tried -O, but, not surprisingly, it did nothing for the one-liner.

I'm simply running the .py files directly, and I'm still on v2.3.

All help appreciated!

Thanks,
Russ
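
P.S. To clarify the "threads of a single process" option I mentioned: something like the rough sketch below is what I had in mind, assuming each app could be refactored to expose a main() entry point. The app_a/app_b/app_c module names are just placeholders for my actual apps, not real modules.

    # Rough sketch only: run each "app" as a thread inside one python.exe.
    # Assumes each app module exposes a main() function (hypothetical names).
    import threading

    import app_a
    import app_b
    import app_c

    def run_all():
        entry_points = [app_a.main, app_b.main, app_c.main]
        workers = []
        for entry in entry_points:
            t = threading.Thread(target=entry)
            t.start()
            workers.append(t)
        # Keep the single process alive until every app has finished.
        for t in workers:
            t.join()

    if __name__ == '__main__':
        run_all()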
When running in Windows, launching each application generates a process, and each of those processes ends up taking up > 4MB of system memory. This memory usage is as reported by the Windows Task manager for the python.exe image name. My Question: Is there any way to reduce this per-process overhead? eg: can you set it somehow so that one python.exe instance handles multiple processes? One possibility considered is to run them as threads of a single process rather than multiple processes, but this has other drawbacks for my application and I'd rather not, Another possibility I considered is to strip out all but the most essential imports in each app, but I tested this out and it has marginal benefits. I demonstrated to myself that a simple one liner app consisting of 'x = raw_input()' still eats up > 2.7MB . I also tried -O but it, not surprisingly, did nothing for the one-liner. I'm simply running the .py files and I am still on v2.3 All help appreciated! Thanks, Russ -- http://mail.python.org/mailman/listinfo/python-list