Marat Sharafutdinov <deca...@gmail.com> added the comment:

But why, if I use multiprocessing (running 100 tasks in each of 100 workers), does it still block the loop inside some workers? Are 100 tasks "a lot of work" for an asyncio loop?

```python
import asyncio
from multiprocessing import Process

worker_count = 100
task_count = 100

def worker_main(worker_id):
    async def main():
        for x in range(1, task_count + 1):
            asyncio.ensure_future(f(x))

    async def f(x):
        if x % 1000 == 0 or x == task_count:
            print(f'[WORKER-{worker_id}] Run f({x})')
        await asyncio.sleep(1)
        # Reschedule f(x) one second after it completes, so every task repeats forever.
        loop.call_later(1, lambda: asyncio.ensure_future(f(x)))

    loop = asyncio.get_event_loop()
    loop.set_debug(True)
    loop.run_until_complete(main())
    loop.run_forever()

if __name__ == '__main__':
    workers = []
    for worker_id in range(worker_count):
        worker = Process(target=worker_main, args=(worker_id,), daemon=True)
        worker.start()
        workers.append(worker)
    # Join instead of busy-waiting: "while True: pass" pegs a full CPU core
    # in the parent and competes with the workers for time.
    for worker in workers:
        worker.join()
```
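Since the reproducer already calls `loop.set_debug(True)`, one way to see *which* callback is blocking the loop is asyncio's slow-callback warning: in debug mode, any callback or task step that runs longer than `loop.slow_callback_duration` (0.1 s by default) is logged on the `asyncio` logger. Below is a minimal sketch, separate from the report itself; the `Capture` handler and the deliberately blocking coroutine are illustrative, not part of the original script:

```python
import asyncio
import logging
import time

def collect_slow_callback_warnings():
    """Run a deliberately blocking coroutine under debug mode and
    return the asyncio logger messages it triggers."""
    records = []

    class Capture(logging.Handler):
        def emit(self, record):
            records.append(record.getMessage())

    logger = logging.getLogger('asyncio')
    handler = Capture()
    logger.addHandler(handler)
    logger.setLevel(logging.WARNING)

    async def blocking():
        time.sleep(0.2)  # synchronous sleep: blocks the whole event loop

    loop = asyncio.new_event_loop()
    loop.set_debug(True)
    loop.slow_callback_duration = 0.1  # the default threshold, shown explicitly
    loop.run_until_complete(blocking())
    loop.close()
    logger.removeHandler(handler)
    return records

if __name__ == '__main__':
    for msg in collect_slow_callback_warnings():
        print(msg)
```

If a worker's loop is being stalled by something synchronous, this should surface an "Executing ... took N seconds" message naming the offending handle.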

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue33115>
_______________________________________