Gabriel Rossetti wrote:
Hello everyone,
I would like to know if it is possible to turn Python code into a
shared lib? I have several processes that use the same base code, and
it seems like every process loads the "shared" code into memory. I
would like it to be loaded once and shared, like a .so on Linux or a
.dll on Windows, and have the interpreters use that shared copy. Is there
a way to do this?
Thank you,
Gabriel
Ok, maybe I mis-stated my problem (or misunderstood your answers). I
don't want to share code as in having multiple processes access a
variable and see the same value, like it is done with threads; what I
want is to not have n copies of the code (classes, functions, etc.)
loaded by n Python interpreters. When you load Apache, for instance, it
runs n processes but loads only one copy of parts of its code (so/dll);
that's what I want to do. In C/C++ I would write a shared lib (so/dll),
so once the system has loaded it, it doesn't re-load it when another
process needs it.
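To illustrate what I mean with the Apache model, here is a rough sketch
(POSIX only; workercode and its serve() function are made-up names for my
base code): the parent imports the base code once and then forks, so the
children reuse the already-loaded module objects and the OS shares the
memory pages copy-on-write instead of every worker re-importing everything.

import os
import workercode   # hypothetical module holding the shared base code

def main():
    pids = []
    for worker_id in range(4):
        pid = os.fork()
        if pid == 0:                      # child: modules are already loaded
            workercode.serve(worker_id)   # hypothetical entry point
            os._exit(0)
        pids.append(pid)
    for pid in pids:                      # parent waits for its workers
        os.waitpid(pid, 0)

if __name__ == "__main__":
    main()

That only helps for workers forked from a common parent, though; what I'm
after is not re-loading the code when another, unrelated process needs it,
the way a .so/.dll behaves.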
This is not done, and most probably won't ever happen. What would you do
with this?
import random
import foo

if random.random() > .5:
    foo.some_function = lambda x: x ** 2
else:
    foo.some_function = lambda x: x + 2
If the module foo were shared among processes, they would affect each
other's code. Not so nice.
Of course one could try and cough up some read-only, copy-on-write
scheme, but that would be hard, if not impossible, and for rather
minimal gains. Take a look at the compiled *.pyc files (which should
pretty much represent the memory the code consumes): they are in the
kilobytes. So optimizing here would be a waste, I'd say.
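If you want to check the numbers yourself, something like this (a modern
Python 3; the module names are just examples) prints the size of a module's
cached bytecode:

import importlib
import importlib.util
import os

for name in ("random", "json", "collections"):   # example modules only
    importlib.import_module(name)                # make sure it is compiled
    spec = importlib.util.find_spec(name)
    if spec is None or not spec.origin or not spec.origin.endswith(".py"):
        continue                                 # builtins / C extensions
    pyc = importlib.util.cache_from_source(spec.origin)
    if os.path.exists(pyc):
        print("%-12s %6.1f KiB" % (name, os.path.getsize(pyc) / 1024.0))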
Diez
--
http://mail.python.org/mailman/listinfo/python-list