Tim Arnold wrote:
"Jan Kaliszewski" <z...@chopin.edu.pl> wrote in message
news:mailman.895.1251958800.2854.python-l...@python.org...
06:49:13 Scott David Daniels <scott.dani...@acm.org> wrote:
Tim Arnold wrote:
(1) what's wrong with having each chapter in a separate thread? Too
much going on for a single processor?
With many more threads than cores, you spend a lot of your CPU time just
switching between tasks.
In fact, Python threads work best on a powerful single core; with more
cores they become surprisingly inefficient.
The culprit is Python's GIL and the way it [mis]cooperates with OS
scheduling.
See: http://www.dabeaz.com/python/GIL.pdf
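To see what those slides are getting at, here is a rough toy sketch (my own
example, nothing to do with Tim's build setup): a pure-Python CPU-bound loop
run in two threads takes about as long as, and on multi-core boxes often
longer than, running it twice sequentially, because only one thread can hold
the GIL at a time.

    # Toy illustration of the GIL effect described in the slides above.
    # count() is pure Python and CPU-bound, so it never releases the GIL.
    import time
    import threading

    def count(n):
        while n > 0:
            n -= 1

    N = 10 * 1000 * 1000

    start = time.time()
    count(N)
    count(N)
    print("sequential : %.2fs" % (time.time() - start))

    start = time.time()
    threads = [threading.Thread(target=count, args=(N,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("two threads: %.2fs" % (time.time() - start))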
Yo
*j
--
Jan Kaliszewski (zuo) <z...@chopin.edu.pl>
I've read about the GIL (I think I understand the problem there)--thanks. In
my example, the actual job run for each chapter ended up being a call to
subprocess (which invoked a different Python program). I figured that would
save me from GIL problems, since each subprocess would have its own GIL.
In the words of Tom Waits, " the world just keeps getting bigger when you
get out on your own". So I'm re-reading now, and maybe what I've been doing
would have been better served by the multiprocessing package.
I'm running Python 2.6 on FreeBSD with dual quad-core CPUs. Now my questions
are:
(1) what the heck should I be doing to get concurrent builds of the
chapters, wait for them all to finish, and pick up processing the main job
again? The separate chapter builds have no need for communication--they're
autonomous.
(2) is using threads whose target functions call subprocess a bad idea?
(3) should I study up on the multiprocessing package and/or pprocessing?
thanks for your inputs,
You could adapt the threading solution I gave to multiprocessing; just
use the multiprocessing queue class instead of the threading queue
class, etc.
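For what it's worth, here is a minimal sketch of one way to handle question
(1). It uses multiprocessing.Pool rather than the queue-based recipe above,
just to keep it short, and the chapter names and the per-chapter script name
"buildchapter.py" are made up:

    import subprocess
    from multiprocessing import Pool

    CHAPTERS = ["ch01", "ch02", "ch03", "ch04"]   # placeholder chapter names

    def build_chapter(name):
        # Each worker process just shells out to the per-chapter build script.
        return subprocess.call(["python", "buildchapter.py", name])

    if __name__ == "__main__":
        pool = Pool(processes=8)          # roughly one worker per core
        results = pool.map(build_chapter, CHAPTERS)   # blocks until all finish
        pool.close()
        pool.join()
        if any(results):
            print("some chapter builds failed: %r" % results)
        # ...pick up processing the main job here...

On question (2): since each chapter build is already its own process, a plain
thread pool that just waits on the subprocess calls would work too; the GIL
is released while a thread blocks waiting for its child process, so the
builds still run in parallel.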