Re: Parallel Processing

2012-01-08 Thread Dave Angel
On 01/08/2012 08:46 PM, Yigit Turgut wrote: On Jan 9, 12:02 am, Dave Angel wrote: Then I'd try calling separate functions (declaring them in depfuncs). And finally I'd try some 3rd-party library. I don't think I will try another package for the same task. I am now moving on to PP + PyCUDA to harn
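
Parallel Python's submit() takes the job function, an argument tuple, a tuple of dependent functions (depfuncs), and a tuple of module names. A minimal sketch of the depfuncs pattern Dave Angel mentions, using hypothetical worker and helper functions:

    import pp

    def helper(x):
        # dependent function used by the worker; must be listed in depfuncs
        return x * x

    def worker(n):
        # sums helper(i) for i < n; "time" is available because it is
        # listed in the modules tuple passed to submit() below
        total = sum(helper(i) for i in range(n))
        time.sleep(0.1)
        return total

    job_server = pp.Server()  # autodetects the number of CPUs by default

    # depfuncs ships helper to the worker process, modules triggers
    # "import time" in the worker's namespace
    job = job_server.submit(worker, (1000,), depfuncs=(helper,), modules=("time",))
    print(job())  # calling the job object waits for it and returns the result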

Re: Parallel Processing

2012-01-08 Thread Yigit Turgut
On Jan 9, 12:02 am, Dave Angel wrote: > On 01/08/2012 11:39 AM, Yigit Turgut wrote: > > screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN) > > timer = pygame.time.Clock() > > white = True > > start = time.time() > > end = time.time() - start > > end2 = time.time() - start

Re: Parallel Processing

2012-01-08 Thread David Hoese
On 1/8/12 1:45 PM, Yigit Turgut wrote: There are no imports other than those defined in the script, which are: import pygame import sys import time import math import pp You are correct: I am trying to pass two functions, and the second one is in the place where a tuple of arguments is supposed to be. But what

Re: Parallel Processing

2012-01-08 Thread Dave Angel
On 01/08/2012 11:39 AM, Yigit Turgut wrote: screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN) timer = pygame.time.Clock() white = True start = time.time() end = time.time() - start end2= time.time() - start def test1(): global end global white while(end<5): end = time.ti
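
The preview cuts off inside test1; judging from the variables shown (the white flag, the end timer, the five-second limit), the loop under discussion has roughly the shape sketched below. This is a guess at the missing part, not the poster's actual listing; the fill colours, the flip call and the frame cap are assumptions.

    import time
    import pygame

    pygame.init()
    screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
    timer = pygame.time.Clock()
    white = True
    start = time.time()
    end = time.time() - start

    def test1():
        # toggle the screen between white and black until five seconds have passed
        global end, white
        while end < 5:
            end = time.time() - start
            screen.fill((255, 255, 255) if white else (0, 0, 0))
            pygame.display.flip()
            white = not white
            timer.tick(60)  # cap the toggle rate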

Re: Parallel Processing

2012-01-08 Thread Yigit Turgut
On Jan 8, 6:00 pm, Chris Angelico wrote: > On Mon, Jan 9, 2012 at 2:45 AM, Yigit Turgut wrote: > > job1 = job_server.submit(test1,()) > > job2 = job_server.submit(test2()) > The first of these passes test1 and an empty tuple as arguments to > submit(). The second calls test2 with no arguments,

Re: Parallel Processing

2012-01-08 Thread Chris Angelico
On Mon, Jan 9, 2012 at 2:45 AM, Yigit Turgut wrote: > job1 = job_server.submit(test1,()) > job2 = job_server.submit(test2()) The first of these passes test1 and an empty tuple as arguments to submit(). The second calls test2 with no arguments, then passes its return value to submit(), which is no
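
A corrected version of the two calls, with trivial stand-ins for the poster's test1 and test2:

    import pp

    def test1():
        # stand-in for the poster's first task
        return "test1 done"

    def test2():
        # stand-in for the poster's second task
        return "test2 done"

    job_server = pp.Server()

    # pass the function object plus an argument tuple; writing
    # submit(test2()) would run test2 immediately in the main process and
    # hand its return value to submit(), which is the mistake being described
    job1 = job_server.submit(test1, ())
    job2 = job_server.submit(test2, ())

    print(job1(), job2())  # calling a job object blocks until its result is ready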

Re: Parallel Processing

2012-01-08 Thread Yigit Turgut
On Jan 8, 4:34 pm, Dave Angel wrote: > On 01/08/2012 08:23 AM, Yigit Turgut wrote: > > Hi all, > > I am trying to run two functions at the same time with Parallel > > Processing (pp) as follows: > > import pygame > > import sys > > import time > > import math > > import pp > >

Re: Parallel Processing

2012-01-08 Thread Dave Angel
On 01/08/2012 08:23 AM, Yigit Turgut wrote: Hi all, I am trying to run two functions at the same time with Parallel Processing (pp) as follows: import pygame import sys import time import math import pp screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN) timer = pygame.time.Clock()

Re: Parallel processing on shared data structures

2009-03-20 Thread Hendrik van Rooyen
wrote: > I'm filing 160 million data points into a set of bins based on their > position. At the moment, this takes just over an hour using interval So why do you not make four sets of bins - one for each core of your quad, and split the points into quarters, and run four processes, and merge
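
That per-core strategy maps directly onto a multiprocessing.Pool: one private set of bins per worker process, merged at the end. A rough sketch, assuming simple fixed-width bins rather than the interval trees the original poster used:

    from collections import Counter
    from multiprocessing import Pool

    BIN_WIDTH = 1000  # hypothetical fixed bin width; the real code used interval trees

    def bin_chunk(points):
        # each worker process builds its own private set of bins
        bins = Counter()
        for p in points:
            bins[p // BIN_WIDTH] += 1
        return bins

    def bin_in_parallel(points, workers=4):
        # split the points into one chunk per core, bin them in parallel, then merge
        chunk_size = (len(points) + workers - 1) // workers
        chunks = [points[i:i + chunk_size] for i in range(0, len(points), chunk_size)]
        with Pool(workers) as pool:
            partial_bins = pool.map(bin_chunk, chunks)
        merged = Counter()
        for part in partial_bins:
            merged.update(part)
        return merged

    if __name__ == "__main__":
        import random
        points = [random.randrange(1000000) for _ in range(100000)]
        print(sum(bin_in_parallel(points).values()))  # should equal len(points)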

Re: Parallel processing on shared data structures

2009-03-19 Thread MRAB
psaff...@googlemail.com wrote: I'm filing 160 million data points into a set of bins based on their position. At the moment, this takes just over an hour using interval trees. I would like to parallelise this to take advantage of my quad core machine. I have some experience of Parallel Python, bu

Re: parallel processing in standard library

2008-01-01 Thread Konrad Hinsen
Emin.shopper Martinian.shopper wrote: > Is there any hope of a parallel processing toolkit being > incorporated into the python standard library? I've seen a wide > variety of toolkits each with various features and limitations. > Unfortunately, each has its own API. For coarse-grained > p

Re: parallel processing in standard library

2007-12-28 Thread Robert Kern
Emin.shopper Martinian.shopper wrote: > On Dec 27, 2007 4:13 PM, Robert Kern <[EMAIL PROTECTED]> wrote: > My recommendation to you is to pick one of the smaller > implementations that > solves the problems in front of you. Read and understand that module >

Re: parallel processing in standard library

2007-12-28 Thread Robert Kern
Stefan Behnel wrote: > Robert Kern wrote: >> The problem is that for SQL databases, there is a substantial API that they >> can >> all share. The implementations are primarily differentiated by other factors >> like speed, in-memory or on-disk, embedded or server, the flavor of SQL, etc. >> and on

Re: parallel processing in standard library

2007-12-28 Thread Calvin Spealman
I think we are a ways off from the point where any of the solutions are well used, matured, and trusted to promote as a Python standard module. I'd love to see it happen, but even worse than it never happening is it happening too soon. On Dec 27, 2007 8:52 AM, Emin.shopper Martinian.shopper <[EMAI

Re: parallel processing in standard library

2007-12-28 Thread Emin.shopper Martinian.shopper
On Dec 27, 2007 4:13 PM, Robert Kern <[EMAIL PROTECTED]> wrote: > Emin.shopper Martinian.shopper wrote: > > If not, is there any hope of something like > > the db-api for coarse grained parallelism (i.e, a common API that > > different toolkits can support)? > > The problem is that for SQL databas

Re: parallel processing in standard library

2007-12-28 Thread Stefan Behnel
Christian Heimes wrote: > Stefan Behnel wrote: >> Well, there is one parallel processing API that already *is* part of >> stdlib: the threading module. So the processing module would fit just nicely into the >> idea of a "standard" library. > Don't you forget the select module and its siblin

Re: parallel processing in standard library

2007-12-28 Thread Christian Heimes
Stefan Behnel wrote: > Well, there is one parallel processing API that already *is* part of stdlib: > the threading module. So the processing module would fit just nicely into the > idea of a "standard" library. Don't you forget the select module and its siblings for I/O bound concurrency? Chris
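
A minimal illustration of the stdlib threading API being referred to, with a placeholder task; note that CPython threads share one interpreter, so this pattern helps with I/O-bound rather than CPU-bound work, which is where the select module and its siblings also come in:

    import threading
    import time

    def task(name, delay):
        # placeholder for an I/O-bound job (sleep stands in for waiting on I/O)
        time.sleep(delay)
        print(name, "finished")

    t1 = threading.Thread(target=task, args=("first", 0.5))
    t2 = threading.Thread(target=task, args=("second", 0.5))
    t1.start()
    t2.start()
    t1.join()  # wait for both threads to complete
    t2.join()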

Re: parallel processing in standard library

2007-12-28 Thread Stefan Behnel
Robert Kern wrote: > The problem is that for SQL databases, there is a substantial API that they > can > all share. The implementations are primarily differentiated by other factors > like speed, in-memory or on-disk, embedded or server, the flavor of SQL, etc. > and only secondarily differentiate

Re: parallel processing in standard library

2007-12-27 Thread Robert Kern
Emin.shopper Martinian.shopper wrote: > Dear Experts, > > Is there any hope of a parallel processing toolkit being incorporated > into the python standard library? I've seen a wide variety of toolkits > each with various features and limitations. Unfortunately, each has its > own API. For coarse-g