Re: Catching exceptions with multi-processing

2015-06-21 Thread Paul Rubin
Fabien writes:
> I am developing a tool which works on individual entities (glaciers)
> and do a lot of operations on them. There are many tasks to do, one
> after each other, and each task follows the same interface: ...

If most of the resources will be spent on computation and the communication

Re: Catching exceptions with multi-processing

2015-06-20 Thread Fabien
On 06/20/2015 05:14 AM, Cameron Simpson wrote:
I would keep your core logic Pythonic, raise exceptions. But I would wrap
each task in something to catch any Exception subclass and report back to
the queue. Untested example:

def subwrapper(q, callable, *args, **kwargs):
    try:
        q.put( ('
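Cameron's untested example is cut off by the archive. A hedged completion of the same wrapper idea might look like the following; everything after the quoted `q.put( ('` (the `'ok'`/`'error'` tags, the except clause, the demo function) is a guess at the intent, not his actual code:

```python
# Completion sketch of Cameron's wrapper pattern: run the real callable,
# catch any Exception subclass, and report result-or-exception back
# through a queue instead of letting the worker blow up silently.
# The tuple tags and the except body are assumptions, not the original.
import multiprocessing

def subwrapper(q, callable, *args, **kwargs):
    try:
        q.put(('ok', callable(*args, **kwargs)))
    except Exception as e:
        q.put(('error', e))

def risky(x):
    # hypothetical stand-in for one of Fabien's tasks
    if x < 0:
        raise ValueError("negative input")
    return x * 2

if __name__ == "__main__":
    q = multiprocessing.Queue()
    subwrapper(q, risky, 3)   # puts ('ok', 6)
    subwrapper(q, risky, -1)  # exception is caught and reported
    print(q.get())
    print(q.get())
```

Because the wrapper only needs `put()`, it works unchanged with a plain `queue.Queue` in-process or a `multiprocessing.Queue` across workers.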

Re: Catching exceptions with multi-processing

2015-06-20 Thread Fabien
On 06/19/2015 10:58 PM, Chris Angelico wrote:
AIUI what he's doing is all the subparts of task1 in parallel, then all
the subparts of task2:

pool.map(task1, dirs, chunksize=1)
pool.map(task2, dirs, chunksize=1)
pool.map(task3, dirs, chunksize=1)

task1 can be done on all of dirs in parallel, as
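The scheme Chris describes can be sketched as a runnable script; the task bodies below are trivial stand-ins (the real tasks operate on glacier directories), but the shape of the driver matches the three `pool.map` calls quoted above:

```python
# Sketch of the pattern quoted above: each task is mapped over all
# directories in parallel, and the tasks themselves run one after the
# other because each pool.map call blocks until it finishes.
import multiprocessing

def task1(d):
    # hypothetical stand-in for the first per-glacier task
    return ("task1", d)

def task2(d):
    # hypothetical stand-in for the second per-glacier task
    return ("task2", d)

if __name__ == "__main__":
    dirs = ["glacier_a", "glacier_b", "glacier_c"]
    with multiprocessing.Pool(2) as pool:
        r1 = pool.map(task1, dirs, chunksize=1)  # task1 on all dirs, parallel
        r2 = pool.map(task2, dirs, chunksize=1)  # then task2 on all dirs
    print(r1)
    print(r2)
```

`chunksize=1` hands the workers one directory at a time, which keeps load balanced when some glaciers take much longer than others.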

Re: Catching exceptions with multi-processing

2015-06-19 Thread Cameron Simpson
On 19Jun2015 18:16, Fabien wrote:
On 06/19/2015 04:25 PM, Andres Riancho wrote:
My recommendation is that you should pass some extra arguments to the task:

* A unique task id
* A result multiprocessing.Queue

When an exception is raised you put (unique_id, exception) to the queue

Re: Catching exceptions with multi-processing

2015-06-19 Thread Chris Angelico
On Sat, Jun 20, 2015 at 1:41 AM, Steven D'Aprano wrote:
> On Sat, 20 Jun 2015 12:01 am, Fabien wrote:
>
>> Folks,
>>
>> I am developing a tool which works on individual entities (glaciers) and
>> do a lot of operations on them. There are many tasks to do, one after
>> each other, and each task fol

Re: Catching exceptions with multi-processing

2015-06-19 Thread Fabien
On 06/19/2015 04:25 PM, Andres Riancho wrote:
Fabien,

My recommendation is that you should pass some extra arguments to the task:

* A unique task id
* A result multiprocessing.Queue

When an exception is raised you put (unique_id, exception) to the queue.
When it succeeds you

Re: Catching exceptions with multi-processing

2015-06-19 Thread Fabien
On 06/19/2015 05:41 PM, Steven D'Aprano wrote:
On Sat, 20 Jun 2015 12:01 am, Fabien wrote:
>Folks,
>
>I am developing a tool which works on individual entities (glaciers) and
>do a lot of operations on them. There are many tasks to do, one after
>each other, and each task follows the same inter

Re: Catching exceptions with multi-processing

2015-06-19 Thread Steven D'Aprano
On Sat, 20 Jun 2015 12:01 am, Fabien wrote:
> Folks,
>
> I am developing a tool which works on individual entities (glaciers) and
> do a lot of operations on them. There are many tasks to do, one after
> each other, and each task follows the same interface:

I'm afraid your description is contrad

Re: Catching exceptions with multi-processing

2015-06-19 Thread Jean-Michel Pichavant
- Original Message -
> From: "Oscar Benjamin"
>
> A simple way to approach this could be something like:
>
> #!/usr/bin/env python3
>
> import math
> import multiprocessing
>
> def sqrt(x):
>     if x < 0:
>         return 'error', x
>     else:
>         return 'success', math.sqrt(x)
>
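The quoted script is truncated by the archive. A self-contained version of the same status-tuple approach could look like this; the worker function matches the quote, while the driver loop at the bottom is a reconstruction, not necessarily what the original message contained:

```python
#!/usr/bin/env python3
# Status-tuple approach from the quoted message: the worker never raises
# across the process boundary. It returns ('error', input) or
# ('success', result) and the parent decides what to do with failures.
import math
import multiprocessing

def sqrt(x):
    if x < 0:
        return 'error', x
    else:
        return 'success', math.sqrt(x)

if __name__ == "__main__":
    # Reconstructed driver: map over a mix of good and bad inputs and
    # sort the results by their status tag.
    with multiprocessing.Pool() as pool:
        for status, value in pool.map(sqrt, [4, -1, 9]):
            if status == 'error':
                print('failed on input', value)
            else:
                print('result:', value)
```

The appeal of this style is that an ordinary `pool.map` suffices: no extra queue is needed, because failures travel back through the normal return path.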

Re: Catching exceptions with multi-processing

2015-06-19 Thread Oscar Benjamin
On 19 June 2015 at 15:01, Fabien wrote:
> Folks,
>
> I am developing a tool which works on individual entities (glaciers) and do
> a lot of operations on them. There are many tasks to do, one after each
> other, and each task follows the same interface:
>
> def task_1(path_to_glacier_dir):
>     o

Re: Catching exceptions with multi-processing

2015-06-19 Thread Jean-Michel Pichavant
- Original Message -
> From: "Fabien"
> To: python-list@python.org
> Sent: Friday, 19 June, 2015 4:01:02 PM
> Subject: Catching exceptions with multi-processing
>
> Folks,
>
> I am developing a tool which works on individual entities (glaciers)
>

Re: Catching exceptions with multi-processing

2015-06-19 Thread Andres Riancho
Fabien,

My recommendation is that you should pass some extra arguments to the task:

* A unique task id
* A result multiprocessing.Queue

When an exception is raised you put (unique_id, exception) to the queue.
When it succeeds you put (unique_id, None). In the main process you consu
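Andres' recommendation can be sketched as follows. The task body and the error condition are hypothetical stand-ins; only the shape (unique id plus result queue, `(unique_id, exception)` on failure, `(unique_id, None)` on success, consumed in the main process) comes from his message. Note that a plain `multiprocessing.Queue` cannot be passed as a `Pool` argument, so the sketch uses a `Manager().Queue()`:

```python
# Sketch of the id-plus-queue pattern: each task reports
# (unique_id, exception) on failure or (unique_id, None) on success,
# and the main process drains the queue to see which tasks failed.
import multiprocessing

def task(unique_id, q, path):
    # hypothetical task body; "bad" in the path simulates a failure
    try:
        if "bad" in path:
            raise IOError("cannot read %s" % path)
        q.put((unique_id, None))   # success
    except Exception as e:
        q.put((unique_id, e))      # failure, reported instead of raised

if __name__ == "__main__":
    # Manager().Queue() is picklable, so it can be sent through Pool.
    q = multiprocessing.Manager().Queue()
    paths = ["glacier_a", "bad_glacier", "glacier_c"]
    with multiprocessing.Pool(2) as pool:
        pool.starmap(task, [(i, q, p) for i, p in enumerate(paths)])
    for _ in paths:
        uid, exc = q.get()
        if exc is not None:
            print("task %d failed: %s" % (uid, exc))
```

The unique id matters because queue entries arrive in completion order, not submission order, so the id is the only reliable way to map a failure back to its glacier.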

Catching exceptions with multi-processing

2015-06-19 Thread Fabien
Folks,

I am developing a tool which works on individual entities (glaciers) and
do a lot of operations on them. There are many tasks to do, one after
each other, and each task follows the same interface:

def task_1(path_to_glacier_dir):
    open file1 in path_to_glacier_dir
    do stuff i
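The interface described in the pseudocode above can be sketched concretely. The file names and the "stuff" done to them are hypothetical placeholders; only the shape of the interface (one glacier directory path in, files read and written inside it) comes from the original post:

```python
# Concrete sketch of the per-glacier task interface: take a directory
# path, read an input file from it, compute something, write the result
# back into the same directory. File names and the computation are
# placeholders for illustration only.
import os

def task_1(path_to_glacier_dir):
    # open file1 in path_to_glacier_dir
    with open(os.path.join(path_to_glacier_dir, "file1.txt")) as f:
        data = f.read()
    # do stuff (placeholder computation)
    result = data.upper()
    # write the result back into the glacier directory
    with open(os.path.join(path_to_glacier_dir, "file2.txt"), "w") as f:
        f.write(result)
```

Because every task takes just the directory path, a list of such functions can be applied to a list of glacier directories with a plain loop or with `pool.map`, which is what the rest of the thread discusses.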