Re: Problem with multiprocessing

2011-05-19 Thread Dan Stromberg
Share as little as possible between your various processes - shared, mutable state is a parallelism tragedy. If you can avoid sharing an entire dictionary, do so. It'd probably be better to dedicate one process to updating your dictionary, and then using a multiprocessing.Queue to pass delta reco
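The pattern Dan describes can be sketched as follows. This is a minimal illustration, not code from the post: the names `worker` and `apply_deltas` are made up, and the "delta records" are simple `(key, value)` tuples. Workers never touch the dictionary; they push deltas onto a `multiprocessing.Queue`, and a single owner process applies them.

```python
import multiprocessing as mp

def worker(q, items):
    # Workers send (key, value) delta records instead of mutating
    # shared state directly.
    for key, value in items:
        q.put((key, value))
    q.put(None)  # sentinel: this worker is finished

def apply_deltas(n_workers=2):
    q = mp.Queue()
    jobs = [[("k%d" % i, i)] for i in range(n_workers)]
    procs = [mp.Process(target=worker, args=(q, job)) for job in jobs]
    for p in procs:
        p.start()
    result = {}  # the one mutable dict, owned by this process only
    finished = 0
    while finished < n_workers:  # drain deltas until all workers signal
        delta = q.get()
        if delta is None:
            finished += 1
        else:
            key, value = delta
            result[key] = value
    for p in procs:
        p.join()
    return result

if __name__ == "__main__":
    print(apply_deltas())
```

Because only one process ever writes to `result`, there is no locking and no shared mutable state to corrupt.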

Problem with multiprocessing

2011-05-19 Thread Pietro Abate
Hi all, I'm a bit struggling to understand a KeyError raised by the multiprocessing library. My idea is pretty simple. I want to create a server that will spawn a number of workers that will share the same socket and handle requests independently. The goal is to build a 3-tier structure where a
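The design Pietro outlines — several worker processes accepting on the same listening socket — can be sketched like this. This is a reconstruction under assumptions, not his code: it uses a fork-based start method so that child processes simply inherit the open socket, and each worker here handles exactly one connection.

```python
import multiprocessing as mp
import socket

def handle_one(server):
    # Each worker blocks in accept() on the *same* listening socket;
    # the kernel hands each incoming connection to exactly one worker.
    conn, _ = server.accept()
    data = conn.recv(1024)
    conn.sendall(data.upper())
    conn.close()

def serve_and_query(n_workers=2):
    # "fork" lets children inherit the open socket object; a spawn-based
    # start method would need explicit socket sharing instead.
    ctx = mp.get_context("fork")
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
    server.listen(5)
    port = server.getsockname()[1]
    workers = [ctx.Process(target=handle_one, args=(server,))
               for _ in range(n_workers)]
    for w in workers:
        w.start()
    replies = []
    for msg in (b"ping", b"pong"):
        with socket.create_connection(("127.0.0.1", port)) as c:
            c.sendall(msg)
            replies.append(c.recv(1024))
    for w in workers:
        w.join()
    server.close()
    return replies
```

The key point for the 3-tier structure he mentions is that no dispatcher process is needed: `accept()` itself distributes connections across the workers.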

Re: strange problem with multiprocessing

2010-11-11 Thread Marc Christiansen
Neal Becker wrote: > Any idea what this could be about? > > Traceback (most recent call last): > File "run-tests-1004.py", line 48, in >results = pool.map (run_test, cases) > File "/usr/lib64/python2.7/multiprocessing/pool.py", line 199, in map >return self.map_async(func, iterable

strange problem with multiprocessing

2010-11-11 Thread Neal Becker
Any idea what this could be about? Traceback (most recent call last): File "run-tests-1004.py", line 48, in results = pool.map (run_test, cases) File "/usr/lib64/python2.7/multiprocessing/pool.py", line 199, in map return self.map_async(func, iterable, chunksize).get() File "/us

Re: problem with multiprocessing and defaultdict

2010-01-12 Thread Wolodja Wentland
On Tue, Jan 12, 2010 at 11:48 +0100, wiso wrote: > They sent back the object filled with data. The problem is very simple: I > have a container, the container has a method read(file_name) that read a > huge file and fill the container with datas. I have more then 1 file to read > so I want to pa

Re: problem with multiprocessing and defaultdict

2010-01-12 Thread wiso
Robert Kern wrote: > On 2010-01-11 17:50 PM, wiso wrote: > >> The problem now is this: >> start reading file r1_200909.log >> start reading file r1_200910.log >> readen 488832 lines from file r1_200910.log >> readen 517247 lines from file r1_200909.log >> >> with huge file (the real case) the pro

Re: problem with multiprocessing and defaultdict

2010-01-11 Thread Robert Kern
On 2010-01-11 17:50 PM, wiso wrote: The problem now is this: start reading file r1_200909.log start reading file r1_200910.log readen 488832 lines from file r1_200910.log readen 517247 lines from file r1_200909.log with huge file (the real case) the program freeze. Is there a solution to avoid

Re: problem with multiprocessing and defaultdict

2010-01-11 Thread wiso
Robert Kern wrote: > On 2010-01-11 17:15 PM, wiso wrote: >> I'm using a class to read some data from files: >> >> import multiprocessing >> from collections import defaultdict >> >> def SingleContainer(): >> return list() >> >> >> class Container(defaultdict): >> """ >> this class s

Re: problem with multiprocessing and defaultdict

2010-01-11 Thread Robert Kern
On 2010-01-11 17:15 PM, wiso wrote: I'm using a class to read some data from files: import multiprocessing from collections import defaultdict def SingleContainer(): return list() class Container(defaultdict): """ this class store odd line in self["odd"] and even line in self["

problem with multiprocessing and defaultdict

2010-01-11 Thread wiso
I'm using a class to read some data from files: import multiprocessing from collections import defaultdict def SingleContainer(): return list() class Container(defaultdict): """ this class store odd line in self["odd"] and even line in self["even"]. It is stupid, but it's only a
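The code in this thread is truncated above, so the following is a reconstruction under assumptions rather than wiso's exact class. It shows the two points the snippet hints at: the default factory must be a module-level function (a lambda would not pickle when the filled container is sent back from a worker), and a `defaultdict` subclass with a custom constructor needs its own `__reduce__`, because `defaultdict`'s built-in one would pass the factory as the first constructor argument. `read` here takes a list of lines rather than a filename, to keep the sketch self-contained.

```python
import multiprocessing as mp
from collections import defaultdict

def SingleContainer():
    # Module-level factory: picklable, unlike a lambda.
    return list()

class Container(defaultdict):
    """Stores even-numbered lines under self['even'] and odd-numbered
    lines under self['odd'], mirroring the thread's toy example."""
    def __init__(self, filename=""):
        super().__init__(SingleContainer)
        self.filename = filename

    def __reduce__(self):
        # Recreate Container(filename) on unpickling, then restore the
        # stored items; without this, defaultdict's own __reduce__ would
        # call Container(SingleContainer) and mangle the constructor args.
        return (Container, (self.filename,), None, None, iter(self.items()))

    def read(self, lines):
        for i, line in enumerate(lines):
            self["even" if i % 2 == 0 else "odd"].append(line)
        return self

def read_one(job):
    filename, lines = job
    return Container(filename).read(lines)

if __name__ == "__main__":
    jobs = [("a.log", ["l0", "l1", "l2"]), ("b.log", ["m0", "m1"])]
    with mp.Pool(2) as pool:
        containers = pool.map(read_one, jobs)
    print(containers[0]["even"], containers[1]["odd"])
```

With `__reduce__` in place, the filled containers survive the round trip through the pool's result pickling intact, factory included.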

Problem with multiprocessing managers

2010-01-06 Thread Metalone
From the documentation for "Using a remote manager" there is the following example code: from multiprocessing.managers import BaseManager import Queue queue = Queue.Queue() class QueueManager(BaseManager): pass QueueManager.register('get_queue', callable=lambda:queue) m = QueueManager(address=('',
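A self-contained, Python 3 rendering of that documentation pattern is sketched below. It differs from the docs snippet in two hedged ways: `Queue` is the Python 3 `queue` module, and instead of `get_server().serve_forever()` (which blocks) it uses `start()` to run the manager in a child process so server and client fit in one demo. The address and authkey are placeholders; a named factory function replaces the docs' lambda so the registry pickles cleanly under any start method.

```python
import queue
from multiprocessing.managers import BaseManager

shared_queue = queue.Queue()

def _get_queue():
    # Named module-level callable (the docs use a lambda, which works
    # with get_server() but may not pickle for start() on all platforms).
    return shared_queue

class QueueManager(BaseManager):
    pass

# Server side: expose the queue under the name 'get_queue'.
QueueManager.register("get_queue", callable=_get_queue)

def round_trip():
    # Port 0: let the OS pick a free port; authkey is a placeholder.
    m = QueueManager(address=("127.0.0.1", 0), authkey=b"secret")
    m.start()  # manager server runs in a child process

    class Client(BaseManager):
        pass
    Client.register("get_queue")  # client side registers the name only

    c = Client(address=m.address, authkey=b"secret")
    c.connect()
    q = c.get_queue()  # a proxy to the queue living in the server
    q.put("hello")
    value = q.get()
    m.shutdown()
    return value
```

Everything the client touches is a proxy; the real queue lives in the manager process, which is what makes the setup work across machines as well.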

Re: Problem with multiprocessing

2009-09-03 Thread Tennessee
> Yes, to use the multiprocessing module, you must make your script > importable, so runtime statements should go into a __main__ > conditional. This way, when multiprocessing imports the module for each > of its threads, the actual runtime code only gets executed once in the > parent thread, whic

Re: Problem with multiprocessing

2009-09-02 Thread Roy Hyunjin Han
On 09/02/2009 04:51 AM, Peter Otten wrote: tleeuwenb...@gmail.com wrote: I have a problem using multiprocessing in a simple way. I created a file, testmp.py, with the following contents: --- import multiprocessing as mp p = mp.Pool(5) def f(x):

Re: Problem with multiprocessing

2009-09-02 Thread Peter Otten
tleeuwenb...@gmail.com wrote: > I have a problem using multiprocessing in a simple way. I created a > file, testmp.py, with the following contents: > > --- > import multiprocessing as mp > > p = mp.Pool(5) > > def f(x): > return x * x > > print

Problem with multiprocessing

2009-09-01 Thread tleeuwenb...@gmail.com
I have a problem using multiprocessing in a simple way. I created a file, testmp.py, with the following contents: --- import multiprocessing as mp p = mp.Pool(5) def f(x): return x * x print map(f, [1,2,3,4,5]) print p.map(f, [1,2,3,4,5]) -
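The failure in this thread, per the replies above, is that the `Pool` is created at import time and before `f` is defined, so when `multiprocessing` re-imports the module in its workers the pool setup re-runs and `f` cannot be found. A corrected `testmp.py`, updated from the original Python 2 snippet to Python 3 syntax, looks like this:

```python
# testmp.py, reworked per the advice in the replies: define f first and
# create the Pool only under a __main__ guard, so the module can be
# imported by worker processes without re-running the runtime code.
import multiprocessing as mp

def f(x):
    return x * x

if __name__ == "__main__":
    print(list(map(f, [1, 2, 3, 4, 5])))  # plain map, for comparison
    with mp.Pool(5) as p:
        print(p.map(f, [1, 2, 3, 4, 5]))  # parallel version
```

Both calls print `[1, 4, 9, 16, 25]`; the `__main__` guard is what lets the second one get that far.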