Hi,
Adam Tauno Williams wrote:
[...]
Here's the guts of my latest incarnation.
def ProcessBatch(files):
    procs = []
    for fname in files:
        # args must be a tuple; passing a bare string here would be
        # unpacked into one argument per character
        procs.append(Process(target=ProcessFile, args=(fname,)))
    for p in procs:
        p.start()
    for p in procs:
        p.join()
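The same batch can also be expressed with multiprocessing.Pool, which reuses a fixed number of worker processes instead of forking one process per file. This is a sketch under the assumption that ProcessFile takes a single filename; the stand-in worker below just returns something checkable:

```python
from multiprocessing import Pool

def process_file(path):
    # Stand-in for the real ProcessFile; returns a checkable value.
    return path.upper()

def process_batch(files, workers=4):
    # Pool.map blocks until every file has been handled,
    # using at most `workers` concurrent processes.
    pool = Pool(processes=workers)
    try:
        return pool.map(process_file, files)
    finally:
        pool.close()
        pool.join()

if __name__ == "__main__":
    print(process_batch(["a.txt", "b.txt"]))
```

Pool.map preserves the input order in its results, which makes it easy to pair each file with its outcome.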
Now, the fun
Hi Doxa,
DoxaLogos wrote:
[...]
I found the cause of my problems. One mistake was following the test queue
example in the documentation, but the biggest problem turned out to be
a pool instantiated globally in my script, which was causing most of the
endless process spawning, even with the "if __name__ == "__main__"" guard in place.
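A minimal illustration of that pitfall (names are hypothetical, not from the original script): a Pool created at module level is re-created in every child that re-imports the module, which is what produces the endless spawning on platforms that start workers by re-importing (e.g. Windows). Moving the Pool under the guard fixes it:

```python
from multiprocessing import Pool

def work(n):
    # Trivial worker; must be defined at module top level
    # so it can be pickled and sent to child processes.
    return n * n

# BAD: `pool = Pool(4)` here, at module level, would execute again
# inside every worker when it re-imports this file, so each worker
# would try to spawn its own pool.

if __name__ == "__main__":
    # GOOD: the pool is only created in the parent process.
    pool = Pool(4)
    try:
        print(pool.map(work, [1, 2, 3]))
    finally:
        pool.close()
        pool.join()
```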
On Jan 19, 10:26 am, Adam Tauno Williams wrote:
> I decided to play around with the multiprocessing module, and I'm
> having some strange side effects that I can't explain. It makes me
> wonder if I'm just overlooking something obvious or not. Basically, I
> have a script that parses through a lot of files doing search and replace
> on key strings
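For concreteness, the per-file unit of work being parallelized might look something like this. This is a sketch with assumed (search, replace) pairs, not the poster's actual code:

```python
def replace_keys(text, replacements):
    # Apply each (search, replace) pair in order.
    for old, new in replacements:
        text = text.replace(old, new)
    return text

def process_file(path, replacements):
    # Read, rewrite, and save one file; this is the unit of work
    # handed to each child process.
    with open(path) as f:
        text = f.read()
    with open(path, "w") as f:
        f.write(replace_keys(text, replacements))
```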