For such tasks my choice would be Twisted combined with ampoule.
It lets you spread the work out over as many processes as you like,
maxing out whatever iron you're sitting on.
HTH, Werner
http://twistedmatrix.com/trac/
https://launchpad.net/ampoule
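(A minimal sketch of the fan-out idea described above, assuming nothing
about ampoule's actual API: it uses the standard library's
multiprocessing.Pool instead, and the cpu_heavy() function and its
inputs are made up for illustration. Python 2.7, to match the thread.)

import multiprocessing

def cpu_heavy(n):
    # stand-in for real CPU-bound work
    return sum(i * i for i in xrange(n))

if __name__ == '__main__':
    # one worker process per core, maxing out the machine
    pool = multiprocessing.Pool(processes=multiprocessing.cpu_count())
    results = pool.map(cpu_heavy, [10 ** 6] * 8)
    pool.close()
    pool.join()
    print results

(pool.map blocks until all eight jobs have been farmed out to the worker
processes and their results collected; ampoule gives you the same kind of
pool but driven from the Twisted reactor, so results come back
asynchronously rather than blocking.)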
On 29.05.2012 16:43, Jabba Laci wrote:
Hehe, I asked this very question a few days ago but didn't get much
wiser:
http://www.gossamer-threads.com/lists/python/python/985701
Best,
Laszlo
On Thu, May 10, 2012 at 2:14 PM, Jabba Laci wrote:
> Hi,
>
> I would like to do some parallel programming with Python but I don't
> know how to start [...]
On 05/10/2012 06:46 AM, Devin Jeanpierre wrote:
> On Thu, May 10, 2012 at 8:14 AM, Jabba Laci wrote:
>> What's the best way?
>
> From what I've heard, http://scrapy.org/. It is a single-thread
> single-process web crawler that nonetheless can download things
> concurrently.
Yes, for I/O-bound tasks [...]
Hi,
Thanks for the answer. I use Linux with CPython 2.7. I plan to work
on both CPU-bound and I/O-bound problems. Which packages should I use
in each case? Could you point me to some guides? When should I use
multiprocessing and when gevent?
Thanks,
Laszlo
On Thu, May 10, 2012 at 2:34 PM, Dave Angel wrote:
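(On the I/O-bound side, a minimal gevent sketch, with made-up URLs;
monkey.patch_all(), spawn() and joinall() are gevent's standard entry
points. For CPU-bound work gevent does not help, because the GIL keeps
a single process from running Python bytecode on more than one core;
that is where a multiprocessing pool like the sketch further up the
thread comes in.)

from gevent import monkey
monkey.patch_all()   # make the blocking stdlib socket calls cooperative

import gevent
import urllib2

def fetch(url):
    # each download runs in its own greenlet; it yields to the others
    # whenever it is waiting on the network
    body = urllib2.urlopen(url).read()
    return url, len(body)

urls = ['http://example.com/'] * 5        # made-up URL list
jobs = [gevent.spawn(fetch, u) for u in urls]
gevent.joinall(jobs, timeout=30)
print [job.value for job in jobs]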
On Thu, May 10, 2012 at 8:14 AM, Jabba Laci wrote:
> What's the best way?
From what I've heard, http://scrapy.org/. It is a single-thread
single-process web crawler that nonetheless can download things
concurrently.
Doing what you want in Scrapy would probably involve learning about
Twisted, the framework it is built on [...]
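(For completeness, a minimal spider sketch. This uses the current
scrapy.Spider API rather than the BaseSpider class Scrapy shipped in
2012, and the start URL and link-following logic are made up for
illustration.)

import scrapy

class LinkSpider(scrapy.Spider):
    name = 'links'
    start_urls = ['http://example.com/']      # made-up starting point

    def parse(self, response):
        # every yielded request goes into Scrapy's scheduler, and the
        # downloads happen concurrently even though the whole crawl
        # runs in a single thread of a single process on top of Twisted
        for href in response.css('a::attr(href)').extract():
            yield response.follow(href, callback=self.parse)

(Save it as links_spider.py and run it with "scrapy runspider
links_spider.py"; no project scaffolding is needed for a one-off
spider.)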
On 05/10/2012 08:14 AM, Jabba Laci wrote:
> Hi,
>
> I would like to do some parallel programming with Python but I don't
> know how to start. There are several ways to go but I don't know what
> the differences are between them: threads, multiprocessing, gevent,
> etc.
>
> I want to use a single machine [...]