Do you do it as a separate process or a thread?
There is the GIL (https://wiki.python.org/moin/GlobalInterpreterLock),
so for CPU-bound work you need to spawn multiple processes.
Best regards,
Marek Mosiewicz
http://marekmosiewicz.pl
On 23.05.2019 at 20:39, Bob van der Poel wrote:
I've got a short script that loops through a number of files and
processes them one at a time.
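Marek's point about the GIL can be sketched with the stdlib multiprocessing module. The `doit` function and the file names below are illustrative stand-ins, not anything from the thread:

```python
# Sketch: sidestep the GIL by farming work out to processes, not threads.
# `doit` and the file names are placeholders for the real per-file job.
from multiprocessing import Pool

def doit(filename):
    # stand-in for the real per-file work
    return filename.upper()

if __name__ == "__main__":
    filelist = ["a.mma", "b.mma", "c.mma"]
    with Pool(processes=4) as pool:
        results = pool.map(doit, filelist)
    print(results)
```

Pool.map blocks until all files are done, but the workers themselves run concurrently, one per process.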
On Sun, May 26, 2019 at 11:05 AM Grant Edwards
wrote:
> On 2019-05-23, Chris Angelico wrote:
> > On Fri, May 24, 2019 at 5:37 AM Bob van der Poel
> wrote:
> >>
> >> I've got a short script that loops through a number of files and
> >> processes them one at a time. I had a bit of time today and figured
On Mon, May 27, 2019 at 4:24 AM Grant Edwards wrote:
>
> On 2019-05-26, Chris Angelico wrote:
>
> > Sometimes, the "simple" and "obvious" code, the part that clearly has
> > no bugs in it, is the part that has the problem. :)
>
> And in this case, the critical part of the code that was actually
> serializing everything wasn't shown.
On 2019-05-26, Chris Angelico wrote:
> Sometimes, the "simple" and "obvious" code, the part that clearly has
> no bugs in it, is the part that has the problem. :)
And in this case, the critical part of the code that was actually
serializing everything wasn't shown. One strives to post problem
d
On Mon, May 27, 2019 at 4:06 AM Grant Edwards wrote:
>
> On 2019-05-23, Chris Angelico wrote:
> > On Fri, May 24, 2019 at 5:37 AM Bob van der Poel wrote:
> >>
> >> I've got a short script that loops through a number of files and
> >> processes them one at a time. I had a bit of time today and figured
On 2019-05-23, Chris Angelico wrote:
> On Fri, May 24, 2019 at 5:37 AM Bob van der Poel wrote:
>>
>> I've got a short script that loops through a number of files and
>> processes them one at a time. I had a bit of time today and figured
>> I'd rewrite the script to process the files 4 at a time by
On 24May2019 11:40, bvdp wrote:
Just got a one-liner working with parallel. Super! All I ended up doing is:
parallel mma {} ::: *mma
+1
Glad to see a nice simple approach.
Cheers,
Cameron Simpson (formerly )
--
https://mail.python.org/mailman/listinfo/python-list
Just got a one-liner working with parallel. Super! All I ended up doing is:
parallel mma {} ::: *mma
which whizzed through my files in less than 1/4 of the time of my
one-at-a-time script. (In case anyone is wondering, or cares, this is a
bunch of Musical Midi Accompaniment files:
https://mello
On 5/23/19 6:32 PM, Cameron Simpson wrote:
On 23May2019 17:04, bvdp wrote:
Anyway, yes the problem is that I was naively using command.getoutput()
which blocks until the command is finished. So, of course, only one
process was being run at one time! Bad me!
I guess I should be looking at subprocess.Popen().
Ahh, 2 really excellent ideas! I'm reading about parallel right now. And, I
know how to use make, so I really should have thought of -j as well. Thanks
for the ideas.
On Fri, May 24, 2019 at 12:02 AM Christian Gollwitzer
wrote:
> Am 23.05.19 um 23:44 schrieb Paul Rubin:
> > Bob van der Poel writes:
Am 23.05.19 um 23:44 schrieb Paul Rubin:
Bob van der Poel writes:
for i in range(0, len(filelist), CPU_COUNT):
    for z in range(i, i+CPU_COUNT):
        doit(filelist[z])
Write your program to just process one file, then use GNU Parallel
to run the program on your 1200 files, 6 at a time
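Paul's "one file per program run" advice is what GNU Parallel automates. The same batching can also be done in pure Python with the stdlib concurrent.futures, which additionally avoids the IndexError the range-based loop above raises when len(filelist) is not a multiple of CPU_COUNT. A sketch with placeholder names:

```python
# Sketch: keep up to 6 worker processes busy without manual index math.
# `doit` and the file list are illustrative stand-ins.
from concurrent.futures import ProcessPoolExecutor

CPU_COUNT = 6

def doit(filename):
    # stand-in for the real per-file work
    return len(filename)

if __name__ == "__main__":
    filelist = ["a.mma", "bb.mma", "ccc.mma"]
    with ProcessPoolExecutor(max_workers=CPU_COUNT) as ex:
        results = list(ex.map(doit, filelist))
    print(results)
```

Unlike batching by CPU_COUNT, the executor hands a new file to a worker as soon as any worker frees up, so a slow file doesn't stall the whole batch.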
On 5/23/2019 2:39 PM, Bob van der Poel wrote:
I'm processing about 1200 files and my total duration is around 2 minutes.
A followup to my previous response, which has not shown up yet. The
Python test suite is over 400 files. You might look at how
test.regrtest runs them in parallel when -j is given.
On 5/23/2019 2:39 PM, Bob van der Poel wrote:
I've got a short script that loops through a number of files and processes
them one at a time. I had a bit of time today and figured I'd rewrite the
script to process the files 4 at a time by using 4 different instances of
python.
As others have said
On 23May2019 17:04, bvdp wrote:
Anyway, yes the problem is that I was naively using command.getoutput()
which blocks until the command is finished. So, of course, only one process
was being run at one time! Bad me!
I guess I should be looking at subprocess.Popen(). Now, a more relevant
question
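A hedged sketch of the subprocess.Popen() direction Bob mentions: Popen returns as soon as the child starts, so several commands can run at once. The `python -c` commands here are harmless placeholders for the real mma invocations:

```python
# Sketch: Popen launches children without blocking, unlike getoutput().
# The commands are placeholders; substitute the real ones.
import subprocess
import sys

cmds = [[sys.executable, "-c", f"print({n} * {n})"] for n in (2, 3, 4)]

# Start all the children first; they run concurrently...
procs = [subprocess.Popen(c, stdout=subprocess.PIPE, text=True) for c in cmds]

# ...then collect each child's output.
outputs = [p.communicate()[0].strip() for p in procs]
print(outputs)
```

For more than a handful of files you would still want to cap how many children run at once, which is where a pool or executor comes in.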
On Fri, May 24, 2019 at 10:48 AM MRAB wrote:
>
> On 2019-05-24 01:22, Chris Angelico wrote:
> > What I'd recommend is a thread pool. Broadly speaking, it would look
> > something like this:
> >
> > jobs = [...]
> >
> > def run_jobs():
> >     while jobs:
> >         try: job = jobs.pop()
> >
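Chris's thread-pool sketch, filled out into a self-contained runnable form. The doubling step is a stand-in for launching the real external command, which is where each thread would spend its time blocked:

```python
# Sketch: N worker threads drain a shared job list until it is empty.
# Threads suffice here because each real job blocks on an external process.
import threading

jobs = list(range(10))      # stand-in job list
results = []
lock = threading.Lock()

def run_jobs():
    while True:
        try:
            job = jobs.pop()        # list.pop() is atomic in CPython
        except IndexError:
            break                   # job list drained: worker exits
        with lock:
            results.append(job * 2) # stand-in for the real work

threads = [threading.Thread(target=run_jobs) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))
```

This keeps exactly four jobs in flight and starts the next one the moment any worker finishes.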
On 2019-05-24 01:22, Chris Angelico wrote:
On Fri, May 24, 2019 at 10:07 AM Bob van der Poel wrote:
Thanks all! The sound you are hearing is my head smacking against my hand!
Or is it my hand against my head?
Anyway, yes the problem is that I was naively using command.getoutput()
which blocks
On Fri, May 24, 2019 at 10:07 AM Bob van der Poel wrote:
>
> Thanks all! The sound you are hearing is my head smacking against my hand!
> Or is it my hand against my head?
>
> Anyway, yes the problem is that I was naively using command.getoutput()
> which blocks until the command is finished. So,
Thanks all! The sound you are hearing is my head smacking against my hand!
Or is it my hand against my head?
Anyway, yes the problem is that I was naively using command.getoutput()
which blocks until the command is finished. So, of course, only one process
was being run at one time! Bad me!
I guess I should be looking at subprocess.Popen().
On 2019-05-23 22:41, Avi Gross via Python-list wrote:
Bob,
As others have noted, you have not made it clear how what you are doing is
running "in parallel."
I have a similar need where I have thousands of folders and need to do an
analysis based on the contents of one at a time and have 8 cores
months or years if run exhaustively on something like a
grid search trying huge numbers of combinations.
Good luck.
Avi
-----Original Message-----
From: Python-list On
Behalf Of Bob van der Poel
Sent: Thursday, May 23, 2019 2:40 PM
To: Python
Subject: More CPUs doen't equal more speed
I've
NT versions running, only that many
running, and kicking off the next one once any of those completes?
-----Original Message-----
From: Python-list
[mailto:python-list-bounces+david.raymond=tomtom@python.org] On Behalf Of
Bob van der Poel
Sent: Thursday, May 23, 2019 2:40 PM
To: Python
Subject: More CPUs doen't equal more speed
On Fri, May 24, 2019 at 5:37 AM Bob van der Poel wrote:
>
> I've got a short script that loops through a number of files and processes
> them one at a time. I had a bit of time today and figured I'd rewrite the
> script to process the files 4 at a time by using 4 different instances of
> python. My
I've got a short script that loops through a number of files and processes
them one at a time. I had a bit of time today and figured I'd rewrite the
script to process the files 4 at a time by using 4 different instances of
python. My basic loop is:
for i in range(0, len(filelist), CPU_COUNT):
f