raunakgu...@gmail.com wrote:
I have some Pickled data, which is stored on disk, and it is about 100 MB in 
size.

When my Python program is executed, the pickled data is loaded using the cPickle
module, and all that works fine.

If I run the program multiple times (e.g. with python main.py), each Python
process loads the same data again, which is the expected behaviour.

How can I make it so that all new Python processes share this data, so it is only
loaded into memory a single time?

I asked the same question on SO, but could not get any constructive responses:
http://stackoverflow.com/questions/10550870/sharing-data-in-python/10551845
Well, a straightforward way of solving this is to use threads.

Have your main program load the data once and spawn threads to process it.
What triggers a thread is up to you: your main program can listen for commands, catch keyboard events, etc.
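A minimal sketch of that idea, assuming the pickled file has already been loaded into a dict (the small in-memory dict and the worker function here are illustrative stand-ins, not part of the original post):

```python
import pickle
import threading

# Load the data once in the main process; every thread then reads the
# same object in shared memory -- no per-thread copy is made.
# (A small in-memory dict stands in for the 100 MB pickle file.)
data = pickle.loads(pickle.dumps({"records": list(range(1000))}))

results = []
lock = threading.Lock()

def worker(start, end):
    # Each thread reads the shared 'data' object directly.
    total = sum(data["records"][start:end])
    with lock:                      # serialize writes to the shared list
        results.append(total)

threads = [threading.Thread(target=worker, args=(i * 250, (i + 1) * 250))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sum(results))  # 499500, the sum of 0..999
```

Note that CPython's GIL means these threads won't run CPU-bound work in parallel; they merely share the loaded data, which is what the question asked for.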

If you want to make sure you're using all CPU cores, you may want to use the multiprocessing module:
http://docs.python.org/library/multiprocessing.html

It offers a thread-like API built on subprocesses, and it provides ways to share memory between them.
I do tend to prefer processes over threads; I find them easier to monitor and control.

Cheers,

JM
--
http://mail.python.org/mailman/listinfo/python-list
