Hi,
both asyncio.as_completed() and asyncio.wait() work with lists only. No
generators are accepted. Is there anything similar to those functions that
pulls Tasks/Futures/coroutines one by one and processes them in a limited
task pool?
I have a gazillion Tasks and do not want to instantiate them all at once.
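For illustration, the closest thing I can build today looks like the sketch
below (fetch(), MAX_WORKERS and the example URLs are made-up names): a
semaphore caps how many downloads run at the same time, but
asyncio.as_completed() still consumes the whole input and creates every
coroutine object up front, which is exactly what I want to avoid.

import asyncio

MAX_WORKERS = 10                      # made-up limit, for illustration only

async def fetch(url):
    await asyncio.sleep(0.1)          # stand-in for a real download
    return url

async def main(urls):
    sem = asyncio.Semaphore(MAX_WORKERS)

    async def bounded(url):
        async with sem:               # at most MAX_WORKERS run concurrently
            return await fetch(url)

    # The problem is this line: the whole input is materialized here,
    # so every coroutine object exists before any of them has finished.
    for fut in asyncio.as_completed([bounded(u) for u in urls]):
        print(await fut)

asyncio.run(main("http://example.com/%d" % i for i in range(100)))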
Hi Maxime,
many thanks for your great solution. It would be so great to have it in
stock asyncio and use it out-of-the-box...
I've made 4 fixes to it that are of a rather "cosmetic" nature. Here is the
final version:
import asyncio
from concurrent import futures
def as_completed_with_max_workers(
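For reference, a sketch of the general shape such a helper can take (this is
only my illustration with assumed names such as max_workers, not the version
discussed above): awaitables are pulled lazily from the iterator, at most
max_workers of them are in flight, and finished tasks are yielded as soon as
they complete.

import asyncio

async def as_completed_with_max_workers(aws, *, max_workers):
    """Yield finished tasks for the awaitables in *aws*, pulling them
    lazily from the (possibly huge) iterator and keeping at most
    *max_workers* of them in flight at any time."""
    aws = iter(aws)
    pending = set()

    def refill():
        # Start new tasks until the pool is full or the iterator is exhausted.
        while len(pending) < max_workers:
            try:
                aw = next(aws)
            except StopIteration:
                break
            pending.add(asyncio.ensure_future(aw))

    refill()
    while pending:
        done, rest = await asyncio.wait(
            pending, return_when=asyncio.FIRST_COMPLETED)
        pending = set(rest)
        refill()
        for task in done:
            yield task

async def demo():
    coros = (asyncio.sleep(0.01, result=i) for i in range(1000))
    async for task in as_completed_with_max_workers(coros, max_workers=10):
        print(task.result())

asyncio.run(demo())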
Hi all
I am trying to use asyncio in real applications and it isn't going that
smoothly; the help of asyncio gurus is badly needed.
Consider a task like crawling the web starting from some web sites. Each
site leads to the generation of new downloading tasks in exponential(!)
progression. However, we don't want to run them all at once.
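To make the shape of the problem concrete, roughly this is what I have in
mind (download(), extract_links() and MAX_WORKERS are placeholder names): a
fixed number of workers feed off a shared queue, and every processed page
may put new URLs back onto that queue.

import asyncio

MAX_WORKERS = 20                     # assumed limit, for illustration only

async def download(url):
    await asyncio.sleep(0.05)        # placeholder for the real HTTP fetch
    return "<html>...</html>"

def extract_links(url, body):
    return []                        # placeholder for real link extraction

async def worker(queue, seen):
    while True:
        url = await queue.get()
        try:
            body = await download(url)
            for link in extract_links(url, body):
                if link not in seen:
                    seen.add(link)
                    queue.put_nowait(link)   # new tasks appear on the fly
        finally:
            queue.task_done()

async def crawl(start_urls):
    queue = asyncio.Queue()
    seen = set(start_urls)
    for url in start_urls:
        queue.put_nowait(url)
    workers = [asyncio.create_task(worker(queue, seen))
               for _ in range(MAX_WORKERS)]
    await queue.join()               # wait until every queued URL is done
    for w in workers:
        w.cancel()
    await asyncio.gather(*workers, return_exceptions=True)

asyncio.run(crawl(["http://example.com/"]))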
Hi all
things like urllib.quote(u"пиво Müller ") fail with the error message:
KeyError: u'\u043f'
Similarly with urllib2.
Anyone got a hint? I need it to form a URI containing non-ASCII chars.
thanks in advance,
best regards
--
Valery
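One common fix, sketched below with Python 3's urllib.parse (on Python 2 the
equivalent is urllib.quote(text.encode('utf-8'))): percent-encoding operates
on bytes, so non-ASCII text has to be UTF-8-encoded before it is quoted. The
example URL and query parameter are made up.

from urllib.parse import quote, urlencode

# quote() percent-encodes the UTF-8 bytes of the string; on Python 3 the
# encoding step happens by default (encoding='utf-8').
path = quote(u"пиво Müller")
print(path)          # %D0%BF%D0%B8%D0%B2%D0%BE%20M%C3%BCller

# Query strings take the same route via urlencode().
query = urlencode({"q": u"пиво Müller"})
print("http://example.com/search/" + path + "?" + query)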
Hi,
multiprocessing.Pool has a promising initializer argument in its
constructor.
However it doesn't look possible to use it to initialize each Pool
worker with some individual value (I'd be glad to be wrong here).
So, how does one initialize each multiprocessing Pool worker with its own
individual value?
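To make the limitation concrete, a sketch (PORTS, worker_init and work are
made-up names): initargs is one fixed tuple that every worker receives, so
all the workers end up initialized with the same value.

import multiprocessing

PORTS = (9001, 9002, 9003, 9004)     # made-up values, one per worker

def worker_init(port):
    # Every worker runs this with the *same* arguments; there is no way
    # to say "worker 0 gets PORTS[0], worker 1 gets PORTS[1], ...".
    global MY_PORT
    MY_PORT = port

def work(item):
    return (MY_PORT, item)

if __name__ == "__main__":
    pool = multiprocessing.Pool(processes=4,
                                initializer=worker_init,
                                initargs=(PORTS[0],))
    print(pool.map(work, range(8)))  # every result carries port 9001
    pool.close()
    pool.join()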
Hi Dan,
> If you create in the parent a queue in shared memory (multiprocessing
> facilitates this nicely), and fill that queue with the values in your
> ports tuple, then you could have each child in the worker pool extract
> a single value from this queue so each worker can have its own, unique port.
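For the record, here is roughly how I read that suggestion, as a sketch
(PORTS and the function names are just illustrative): the parent fills a
multiprocessing.Queue with the port values and passes it to the pool via
initargs, and each worker pops exactly one value in its initializer.

import multiprocessing

PORTS = (9001, 9002, 9003, 9004)     # illustrative values, one per worker

def worker_init(port_queue):
    # Each worker pops exactly one value, so every process ends up
    # with its own, unique port.
    global MY_PORT
    MY_PORT = port_queue.get()

def work(item):
    return (MY_PORT, item)

if __name__ == "__main__":
    port_queue = multiprocessing.Queue()
    for port in PORTS:
        port_queue.put(port)

    pool = multiprocessing.Pool(processes=len(PORTS),
                                initializer=worker_init,
                                initargs=(port_queue,))
    print(pool.map(work, range(8)))
    pool.close()
    pool.join()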