[issue41505] asyncio.gather of large streams with limited resources

2020-08-23 Thread Kevin Amado
Kevin Amado added the comment: Yeah, definitely, it must be workers. I've experimented a lot with this and finally found something with an interface similar to asyncio.as_completed:
- you control concurrency with a `workers` parameter
- you upper-bound memory usage with a `worker_greediness` parameter
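
The implementation itself is not reproduced in this digest. A minimal sketch of an interface like the one described above, where the function name and all internals are assumptions for illustration rather than the actual patch, might look like this: `workers` tasks pull jobs from a queue, and a bounded results queue keeps at most workers * worker_greediness finished-but-unconsumed results in memory:

    import asyncio
    from typing import AsyncIterator, Awaitable, Iterable, TypeVar

    T = TypeVar("T")

    async def as_completed_bounded(
        awaitables: Iterable[Awaitable[T]],
        workers: int = 8,
        worker_greediness: int = 2,
    ) -> AsyncIterator[T]:
        # Hypothetical sketch (error handling omitted for brevity).
        jobs: asyncio.Queue = asyncio.Queue()
        for aw in awaitables:
            jobs.put_nowait(aw)
        total = jobs.qsize()
        # Bounded queue: workers block here instead of racing ahead,
        # which is what upper-bounds memory usage.
        results: asyncio.Queue = asyncio.Queue(maxsize=workers * worker_greediness)

        async def run() -> None:
            while not jobs.empty():
                aw = jobs.get_nowait()
                await results.put(await aw)  # blocks while the results queue is full

        tasks = [asyncio.create_task(run()) for _ in range(workers)]
        try:
            for _ in range(total):
                yield await results.get()
        finally:
            for task in tasks:
                task.cancel()

A caller would consume it like asyncio.as_completed, e.g. `async for result in as_completed_bounded(coros, workers=4): ...`, except that it yields results directly rather than futures.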

[issue41505] asyncio.gather of large streams with limited resources

2020-08-22 Thread Caleb Hattingh
Caleb Hattingh added the comment: The traditional way this is done is with a finite number of workers pulling work off a queue. This is straightforward to set up with builtins:

    from uuid import uuid4
    import asyncio, random

    async def worker(q: asyncio.Queue):
        while job := await q.get():
            # (the message is truncated here in this archive; presumably
            # each job is processed, e.g. simulated work:)
            await asyncio.sleep(random.random())
            print(f"processed job {job}")
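
The rest of the message is truncated in this archive. A driver for this worker pattern, under the assumption that a falsy sentinel (None) is what ends each worker's `while job := ...` loop, might look like:

    async def main(n_jobs: int = 100, n_workers: int = 10) -> None:
        q: asyncio.Queue = asyncio.Queue()
        # a finite number of workers bounds concurrency, regardless of queue size
        tasks = [asyncio.create_task(worker(q)) for _ in range(n_workers)]
        for _ in range(n_jobs):
            await q.put(uuid4().hex)  # enqueue the work
        for _ in range(n_workers):
            await q.put(None)         # falsy sentinel: stops one worker each
        await asyncio.gather(*tasks)

    asyncio.run(main())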

[issue41505] asyncio.gather of large streams with limited resources

2020-08-07 Thread Kevin Amado
Change by Kevin Amado: Removed file: https://bugs.python.org/file49377/materialize-implementation.py

[issue41505] asyncio.gather of large streams with limited resources

2020-08-07 Thread Kevin Amado
New submission from Kevin Amado: Sometimes when dealing with high-concurrency systems, developers face the problem of concurrently executing a large number of tasks while taking care of a finite pool of resources. Just to mention some examples:
- reading asynchronously a lot of files without exceeding the operating system's limit on open files
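
The submission is truncated in this archive, and the attached implementation is not reproduced here. As background for the problem it describes, a common workaround today is to bound asyncio.gather with an asyncio.Semaphore; the names and the limit of 100 below are illustrative, not part of the proposal:

    import asyncio

    def read_bytes(path):
        # blocking read, run in a thread so the event loop stays responsive
        with open(path, "rb") as handle:
            return handle.read()

    async def read_file(sem, path):
        async with sem:  # hold one of the limited open-file slots
            loop = asyncio.get_running_loop()
            return await loop.run_in_executor(None, read_bytes, path)

    async def main(paths):
        sem = asyncio.Semaphore(100)  # at most 100 files open at any moment
        return await asyncio.gather(*(read_file(sem, path) for path in paths))

Note that this still creates every coroutine object up front, so memory grows with the total number of tasks, which appears to be part of the motivation for this issue.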