Share as little as possible between your various processes - shared, mutable
state is a parallelism tragedy.
If you can avoid sharing an entire dictionary, do so. It'd probably be
better to dedicate one process to updating your dictionary, and then use a
multiprocessing.Queue to pass delta reco
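The delta-passing idea above can be sketched as follows. This is a minimal Python 3 sketch, not the poster's actual code: the `updater`/`worker` names and the `(key, increment)` delta shape are illustrative assumptions.

```python
import multiprocessing as mp

def updater(deltas, result):
    # Sole owner of the dictionary: applies (key, increment) deltas
    # sent by the workers until it sees the None sentinel.
    d = {}
    while True:
        delta = deltas.get()
        if delta is None:
            break
        key, inc = delta
        d[key] = d.get(key, 0) + inc
    result.put(d)

def worker(deltas, n):
    # Workers never touch the dict; they only send deltas.
    for _ in range(n):
        deltas.put(("count", 1))

if __name__ == "__main__":
    deltas, result = mp.Queue(), mp.Queue()
    owner = mp.Process(target=updater, args=(deltas, result))
    owner.start()
    workers = [mp.Process(target=worker, args=(deltas, 10)) for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    deltas.put(None)        # all workers done: stop the owner
    print(result.get())     # {'count': 40}
    owner.join()
```

Because only the owner process ever mutates the dict, there is no shared mutable state and no locking to get wrong.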
Hi all,
I'm struggling a bit to understand a KeyError raised by the multiprocessing
library.
My idea is pretty simple. I want to create a server that will spawn a number of
workers that will share the same socket and handle requests independently. The
goal is to build a 3-tier structure where a
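That design, one listening socket shared by several worker processes that each call accept() independently, can be sketched like this; the worker count, message format, and use of a demo client are illustrative assumptions, not the poster's code.

```python
import multiprocessing as mp
import socket

def worker(server_sock):
    # Every worker blocks in accept() on the same listening socket;
    # the kernel hands each incoming connection to exactly one worker.
    while True:
        conn, _addr = server_sock.accept()
        conn.sendall(b"hello from worker %d\n" % mp.current_process().pid)
        conn.close()

if __name__ == "__main__":
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))        # port 0: let the OS choose one
    srv.listen(5)
    # The listening socket is passed to (and shared by) all children.
    workers = [mp.Process(target=worker, args=(srv,), daemon=True)
               for _ in range(4)]
    for w in workers:
        w.start()
    # Demo client: exactly one of the workers serves this connection.
    with socket.create_connection(srv.getsockname()) as client:
        print(client.recv(100))
    for w in workers:
        w.terminate()
```

Each request is handled by whichever worker wins the accept(), so the workers stay fully independent of one another.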
Neal Becker wrote:
> Any idea what this could be about?
>
> Traceback (most recent call last):
>   File "run-tests-1004.py", line 48, in <module>
>     results = pool.map (run_test, cases)
>   File "/usr/lib64/python2.7/multiprocessing/pool.py", line 199, in map
>     return self.map_async(func, iterable
Any idea what this could be about?
Traceback (most recent call last):
  File "run-tests-1004.py", line 48, in <module>
    results = pool.map (run_test, cases)
  File "/usr/lib64/python2.7/multiprocessing/pool.py", line 199, in map
    return self.map_async(func, iterable, chunksize).get()
  File "/us
On Tue, Jan 12, 2010 at 11:48 +0100, wiso wrote:
> They sent back the object filled with data. The problem is very simple: I
> have a container, and the container has a method read(file_name) that reads a
> huge file and fills the container with data. I have more than 1 file to read,
> so I want to pa
Robert Kern wrote:
> On 2010-01-11 17:50 PM, wiso wrote:
>
>> The problem now is this:
>> start reading file r1_200909.log
>> start reading file r1_200910.log
>> readen 488832 lines from file r1_200910.log
>> readen 517247 lines from file r1_200909.log
>>
>> with huge file (the real case) the pro
On 2010-01-11 17:50 PM, wiso wrote:
The problem now is this:
start reading file r1_200909.log
start reading file r1_200910.log
readen 488832 lines from file r1_200910.log
readen 517247 lines from file r1_200909.log
with huge files (the real case) the program freezes. Is there a solution to
avoid
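One classic cause of exactly this freeze (a guess from the preview, but a well-known multiprocessing pitfall): if each child sends its filled container back through a multiprocessing.Queue, the parent must get() the results before join()ing the children. A child cannot exit while its queue feeder thread is still blocked writing a large object into the pipe, so join()-before-get() deadlocks once the results outgrow the pipe buffer. A minimal sketch; the file names are reused from the log above, but `read_file` and its fake line data are stand-ins:

```python
import multiprocessing as mp

def read_file(name, queue):
    # Stand-in for reading a real log file: build a large result
    # in the child and send it back through the queue.
    lines = ["%s:%d" % (name, i) for i in range(200000)]
    queue.put((name, lines))

if __name__ == "__main__":
    queue = mp.Queue()
    procs = [mp.Process(target=read_file, args=(n, queue))
             for n in ("r1_200909.log", "r1_200910.log")]
    for p in procs:
        p.start()
    # get() BEFORE join(): draining the queue lets the feeder threads
    # in the children finish, so the children can actually exit.
    results = {name: len(lines)
               for name, lines in (queue.get() for _ in procs)}
    for p in procs:
        p.join()
    print(results)   # 200000 lines counted for each file
```

Swapping the get() loop and the join() loop reproduces the hang once the payload is big enough.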
Robert Kern wrote:
> On 2010-01-11 17:15 PM, wiso wrote:
>> I'm using a class to read some data from files:
>>
>> import multiprocessing
>> from collections import defaultdict
>>
>> def SingleContainer():
>> return list()
>>
>>
>> class Container(defaultdict):
>> """
>> this class s
On 2010-01-11 17:15 PM, wiso wrote:
I'm using a class to read some data from files:
import multiprocessing
from collections import defaultdict
def SingleContainer():
return list()
class Container(defaultdict):
"""
this class stores odd lines in self["odd"] and even lines in self["
I'm using a class to read some data from files:
import multiprocessing
from collections import defaultdict
def SingleContainer():
return list()
class Container(defaultdict):
"""
this class stores odd lines in self["odd"] and even lines in self["even"].
It is stupid, but it's only a
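One subtlety when pool workers return a defaultdict subclass like this: the result is pickled on its way back to the parent, and pickle rebuilds a defaultdict as `cls(default_factory)`, so the subclass's `__init__` must accept that argument. A sketch of the pattern in Python 3 (the `fill` helper and its fake line data are illustrative stand-ins for the poster's file reading):

```python
import multiprocessing as mp
from collections import defaultdict

def SingleContainer():
    return list()

class Container(defaultdict):
    """Stores odd lines in self["odd"] and even lines in self["even"]."""
    def __init__(self, default_factory=SingleContainer):
        # pickle reconstructs a defaultdict subclass by calling the
        # class with the default factory, so accept it here.
        super().__init__(default_factory)

def fill(file_name):
    # Stand-in for Container.read(file_name): four fake lines.
    c = Container()
    for i in range(4):
        c["odd" if i % 2 else "even"].append("%s line %d" % (file_name, i))
    return c

if __name__ == "__main__":
    with mp.Pool(2) as pool:
        containers = pool.map(fill, ["r1_200909.log", "r1_200910.log"])
    for c in containers:
        print(len(c["even"]), len(c["odd"]))   # 2 2 for each file
```

If `__init__` took other required arguments instead, unpickling in the parent would fail, which is one way these pool round-trips break.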
From the documentation for Using a remote manager there is the
following example code:
from multiprocessing.managers import BaseManager
import Queue
queue = Queue.Queue()
class QueueManager(BaseManager): pass
QueueManager.register('get_queue', callable=lambda:queue)
m = QueueManager(address=('',
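The preview cuts off mid-address, so here is a completed sketch. The port 50000 and the authkey are assumptions, not values from the original; the module names are Python 3 (`queue` rather than `Queue`); and it uses start()/shutdown() so the demo terminates, where a long-running server would instead call `get_server().serve_forever()`. A named function replaces the docs' lambda so the registered callable stays picklable.

```python
from multiprocessing.managers import BaseManager
import queue

shared_queue = queue.Queue()

def get_shared_queue():
    # Module-level (rather than a lambda) so it can be pickled if the
    # server process is started via spawn.
    return shared_queue

class QueueManager(BaseManager):
    pass

# Expose the queue to remote clients under a registered name.
QueueManager.register("get_queue", callable=get_shared_queue)

if __name__ == "__main__":
    # Address and authkey are illustrative; clients must use the same ones.
    manager = QueueManager(address=("127.0.0.1", 50000),
                           authkey=b"abracadabra")
    manager.start()               # server runs in a background process
    q = manager.get_queue()       # a proxy; remote clients do the same
    q.put("hello")
    print(q.get())
    manager.shutdown()
```

A client on another machine would register "get_queue" (without a callable), construct a QueueManager with the server's address and authkey, and call connect().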
> Yes, to use the multiprocessing module, you must make your script
> importable, so runtime statements should go into a __main__
> conditional. This way, when multiprocessing imports the module for each
> of its worker processes, the actual runtime code only gets executed once
> in the parent process, whic
On 09/02/2009 04:51 AM, Peter Otten wrote:
tleeuwenb...@gmail.com wrote:
I have a problem using multiprocessing in a simple way. I created a
file, testmp.py, with the following contents:
---
import multiprocessing as mp
p = mp.Pool(5)
def f(x):
tleeuwenb...@gmail.com wrote:
> I have a problem using multiprocessing in a simple way. I created a
> file, testmp.py, with the following contents:
>
> ---
> import multiprocessing as mp
>
> p = mp.Pool(5)
>
> def f(x):
> return x * x
>
> print
I have a problem using multiprocessing in a simple way. I created a
file, testmp.py, with the following contents:
---
import multiprocessing as mp
p = mp.Pool(5)
def f(x):
return x * x
print map(f, [1,2,3,4,5])
print p.map(f, [1,2,3,4,5])
-
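The fix is to guard the Pool creation and the map calls so they run only in the parent process, not on every re-import by a worker. A sketch of a corrected testmp.py, updated to Python 3 syntax (print() as a function, list() around map()):

```python
import multiprocessing as mp

def f(x):
    # Top-level, so worker processes can find it when they import
    # this module.
    return x * x

if __name__ == "__main__":
    # The Pool exists only in the parent; children importing this
    # module just get the definition of f.
    with mp.Pool(5) as p:
        print(list(map(f, [1, 2, 3, 4, 5])))   # [1, 4, 9, 16, 25]
        print(p.map(f, [1, 2, 3, 4, 5]))       # [1, 4, 9, 16, 25]
```

In the original, `p = mp.Pool(5)` at module scope re-runs in every worker as it imports testmp.py, which is what makes the unguarded version misbehave.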
15 matches