On 01/08/2012 08:46 PM, Yigit Turgut wrote:
On Jan 9, 12:02 am, Dave Angel wrote:
Then I'd try calling separate functions (declaring them in depfuncs).
And finally I'd try some 3rd party library.
I don't think I'll try another package for the same task. I am now
moving on to PP + PyCUDA to harness ...
Yigit Turgut wrote:
>
> > There are no imports other than defined on the script, which are;
>
> > import pygame
> > import sys
> > import time
> > import math
> > import pp
>
> > You are correct about trying to pass two functions, and the second one
> > takes one argument while the other doesn't. To get familiar with parallel
> > processing I am experimenting now without arguments, and then I will
> > embed the code into my application. I am experimenting with the
> > following:
import pygame
import sys
import time
import math
import pp
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
On 01/08/2012 11:39 AM, Yigit Turgut wrote:
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
timer = pygame.time.Clock()
white = True
start = time.time()
end = time.time() - start
end2 = time.time() - start

def test1():
    global end
    global white
    while end < 5:                  # run for five seconds
        end = time.time() - start
On Jan 8, 6:00 pm, Chris Angelico wrote:
> On Mon, Jan 9, 2012 at 2:45 AM, Yigit Turgut wrote:
> > job1 = job_server.submit(test1,())
> > job2 = job_server.submit(test2())
>
> The first of these passes test1 and an empty tuple as arguments to
> submit(). The second calls test2 with no arguments, then passes its
> return value to submit(), which is not what you want.
On Mon, Jan 9, 2012 at 2:45 AM, Yigit Turgut wrote:
> job1 = job_server.submit(test1,())
> job2 = job_server.submit(test2())
The first of these passes test1 and an empty tuple as arguments to
submit(). The second calls test2 with no arguments, then passes its
return value to submit(), which is not what you want.
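A minimal sketch of the corrected calls, assuming (as described above) that
test1 takes no arguments and test2 takes one; the dummy bodies here are
placeholders, not the code from the thread:

import pp

def test1():
    return "test1 done"

def test2(n):                       # hypothetical single argument
    return n * 2

job_server = pp.Server()

# Pass the function object plus an argument tuple; submit() calls it for you.
job1 = job_server.submit(test1, ())
job2 = job_server.submit(test2, (21,))

print(job1())   # calling a job object blocks until its result is ready
print(job2())   # -> 42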
On Jan 8, 4:34 pm, Dave Angel wrote:
> On 01/08/2012 08:23 AM, Yigit Turgut wrote:
>
> > Hi all,
>
> > I am trying to run two functions at the same time with Parallel
> > Processing (pp) as follows:
>
> > import pygame
> > import sys
On 01/08/2012 08:23 AM, Yigit Turgut wrote:
Hi all,
I am trying to run two functions at the same time with Parallel
Processing (pp) as follows:
import pygame
import sys
import time
import math
import pp
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
timer = pygame.time.Clock()
Hi all,
I am trying to run two functions at the same time with Parallel
Processing (pp) as follows:
import pygame
import sys
import time
import math
import pp
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
timer = pygame.time.Clock()
white = True
start = time.time()
end = time.time() - start
Hi All,
Is there a way to print or use the value of 'new' in the main function of
the script below?
from thread import start_new_thread, allocate_lock
num_threads = 0
thread_started = False
lock = allocate_lock()
def heron(a):
    global num_threads, thread_started
    lock.acquire()
    num_threads += 1
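One way to get 'new' back into the main thread is a shared structure guarded
by the lock. A minimal sketch, assuming heron() iterates Heron's square-root
method in a local variable called new (the body above is cut off):

from thread import start_new_thread, allocate_lock   # Python 3: _thread
import time

lock = allocate_lock()
results = {}                        # shared dict: argument -> computed value

def heron(a):
    # Heron's method for sqrt(a); 'new' holds the running estimate.
    new, old = a / 2.0 or 1.0, 0.0
    while abs(new - old) > 1e-9:
        old, new = new, (new + a / new) / 2.0
    lock.acquire()
    results[a] = new                # hand the value back to the main thread
    lock.release()

for x in (2.0, 3.0, 5.0):
    start_new_thread(heron, (x,))

time.sleep(0.5)                     # crude wait; real code would count threads
print(results)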
On Mar 24, 1:13 pm, Jon Clements wrote:
> On 24 Mar, 15:27, Glazner wrote:
>
> > Hi!
>
> > I need to replace an app that does number crunching over a local
> > network.
> > It has about 50 computers as slaves.
> > Each computer needs to run a COM component that will do the "job".
> > Right now the system uses MFC threads and DCOM to distribute the load.
Have you checked Hadoop?
On Wed, Mar 24, 2010 at 11:43 PM, Jon Clements wrote:
> On 24 Mar, 15:27, Glazner wrote:
> > Hi!
> >
> > I need to replace an app that does number crunching over a local
> > network.
> > It has about 50 computers as slaves.
> > Each computer needs to run a COM component that will do the "job".
On 24 Mar, 15:27, Glazner wrote:
> Hi!
>
> I need to replace an app that does number crunching over a local
> network.
> It has about 50 computers as slaves.
> Each computer needs to run a COM component that will do the "job".
> Right now the system uses MFC threads and DCOM to distribute the load.
>
> As I said, now I'm trying to replace this system with Python ...
Hi!
I need to replace an app that does number crunching over a local
network.
It has about 50 computers as slaves.
Each computer needs to run a COM component that will do the "job".
Right now the system uses MFC threads and DCOM to distribute the load.
As I said, now I'm trying to replace this system with Python ...
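Parallel Python itself can farm jobs out over a LAN: each slave runs the
ppserver.py script that ships with pp, and the master lists the slaves when
creating its server. A rough sketch, with hypothetical host names and a
stand-in for the real number crunching:

import pp

# Each of these hosts runs: python ppserver.py -p 60000
ppservers = ("slave01:60000", "slave02:60000")          # ... up to 50 nodes

job_server = pp.Server(ncpus=0, ppservers=ppservers)   # 0 = no local workers

def crunch(chunk):
    # stand-in for the work the COM component used to do
    return sum(x * x for x in chunk)

data = range(1000000)
chunks = [data[i::len(ppservers)] for i in range(len(ppservers))]
jobs = [job_server.submit(crunch, (chunk,)) for chunk in chunks]
print(sum(job() for job in jobs))   # each job() blocks until its result arrives
job_server.print_stats()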
psaff...@googlemail.com wrote:
> I'm filing 160 million data points into a set of bins based on their
> position. At the moment, this takes just over an hour using interval
So why not make four sets of bins, one for each core of your quad, split
the points into quarters, run four processes, and merge the results at the end?
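A minimal sketch of that suggestion with the stdlib multiprocessing module;
the bin edges and flat list of positions here are made up for illustration:

from multiprocessing import Pool
import bisect

bin_edges = [0.0, 10.0, 20.0, 30.0, 40.0]      # hypothetical bin boundaries

def bin_chunk(points):
    # one private set of bins per worker process
    counts = [0] * (len(bin_edges) + 1)
    for p in points:
        counts[bisect.bisect_right(bin_edges, p)] += 1
    return counts

def parallel_bin(points, nprocs=4):
    size = (len(points) + nprocs - 1) // nprocs
    pieces = [points[i * size:(i + 1) * size] for i in range(nprocs)]
    pool = Pool(nprocs)
    partials = pool.map(bin_chunk, pieces)     # four processes, four bin sets
    pool.close()
    pool.join()
    return [sum(col) for col in zip(*partials)]   # merge counts bin-by-bin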
psaff...@googlemail.com wrote:
I'm filing 160 million data points into a set of bins based on their
position. At the moment, this takes just over an hour using interval
trees. I would like to parallelise this to take advantage of my quad
core machine. I have some experience of Parallel Python, but PP seems
to only really work for ...
I'm filing 160 million data points into a set of bins based on their
position. At the moment, this takes just over an hour using interval
trees. I would like to parallelise this to take advantage of my quad
core machine. I have some experience of Parallel Python, but PP seems
to only really work for ...
Emin.shopper Martinian.shopper wrote:
> Is there any hope of a parallel processing toolkit being
> incorporated into the python standard library? I've seen a wide
> variety of toolkits each with various features and limitations.
> Unfortunately, each has its own API. For coarse-grained parallelism,
> I suspect I'd be ...
> ... then I worry that the discussion would devolve into an argument about
> the pros and cons of the particular implementation instead of the API. Even
> worse, it might devolve into an argument of the value of fine-grained vs.
> coarse-grained parallelism or the GIL. Considering that ...
>> ... flavor of SQL, etc., and only secondarily differentiated by their
>> extensions to the DB-API. With parallel processing, the API itself is a key
>> differentiator between toolkits and approaches. Different problems require
>> different APIs, not just different implementations.
Emin.shopper Martinian.shopper
<[EMAIL PROTECTED]> wrote:
> Dear Experts,
>
> Is there any hope of a parallel processing toolkit being incorporated into
> the python standard library? I've seen a wide variety of toolkits each with
> various features and limitations. Unfortunately, each has its own API. For
> coarse-grained parallelism, I suspect I'd be ...
> ... differentiated by their extensions to the DB-API. With
> parallel processing, the API itself is a key differentiator between
> toolkits and approaches. Different problems require different APIs,
> not just different implementations.
I disagree. Most of the implementations of coarse-grained parallelism ...
Christian Heimes wrote:
> Stefan Behnel wrote:
>> Well, there is one parallel processing API that already *is* part of stdlib:
>> the threading module. So the processing module would fit just nicely into the
>> idea of a "standard" library.
>
> Don't you forget the select module and its siblings for ...
Stefan Behnel wrote:
> Well, there is one parallel processing API that already *is* part of stdlib:
> the threading module. So the processing module would fit just nicely into the
> idea of a "standard" library.

Don't you forget the select module and its siblings for ...
> ... and only secondarily differentiated by their extensions to the DB-API. With
> parallel processing, the API itself is a key differentiator between toolkits
> and approaches. Different problems require different APIs, not just different
> implementations.

Well, there is one parallel processing API that already *is* part of stdlib:
the threading module. So the processing module would fit just nicely into the
idea of a "standard" library.
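For what it's worth, the processing package discussed here did end up in the
stdlib (as multiprocessing, in Python 2.6), and its Process class deliberately
mirrors threading.Thread, which is what makes it sit so naturally beside the
threading module. A small illustrative example:

from multiprocessing import Process, Queue

def worker(q, n):
    q.put(n * n)            # send the result back to the parent process

if __name__ == "__main__":
    q = Queue()
    procs = [Process(target=worker, args=(q, n)) for n in range(4)]
    for p in procs:
        p.start()
    results = [q.get() for _ in procs]   # drain before join (see mp docs)
    for p in procs:
        p.join()
    print(sorted(results))               # [0, 1, 4, 9]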
Emin.shopper Martinian.shopper wrote:
> Dear Experts,
>
> Is there any hope of a parallel processing toolkit being incorporated
> into the python standard library? I've seen a wide variety of toolkits
> each with various features and limitations. Unfortunately, each has its
> own API. For coarse-grained parallelism, I suspect I'd be ...
Dear Experts,
Is there any hope of a parallel processing toolkit being incorporated into
the python standard library? I've seen a wide variety of toolkits each with
various features and limitations. Unfortunately, each has its own API. For
coarse-grained parallelism, I suspect I'd be ...