On Tue, Nov 3, 2020 at 3:27 AM Frank Millman wrote:
> It works, and it does look neater. But I want to start some background
> tasks before starting the server, and cancel them on Ctrl+C.
>
> Using the 'old' method, I can wrap 'loop.run_forever()' in a
> try/except/finally, check for KeyboardInterrupt …
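One way to do what is being asked here, sketched as a guess rather than taken from the thread: asyncio.run() drives a main coroutine that starts the background tasks, serves until interrupted, and cancels them on the way out. The worker, handler, host and port below are all illustrative, not Frank's code.

    import asyncio

    async def background_worker(name):
        # Stand-in for one of the background tasks mentioned above.
        try:
            while True:
                await asyncio.sleep(1)
        except asyncio.CancelledError:
            print(f"{name}: cancelled, cleaning up")
            raise

    async def handle_client(reader, writer):
        # Trivial connection handler so start_server has something to call.
        writer.close()
        await writer.wait_closed()

    async def main():
        workers = [asyncio.create_task(background_worker(f"worker-{i}"))
                   for i in range(3)]
        server = await asyncio.start_server(handle_client, "127.0.0.1", 8080)
        try:
            async with server:
                await server.serve_forever()   # runs until cancelled
        finally:
            for w in workers:
                w.cancel()
            await asyncio.gather(*workers, return_exceptions=True)

    if __name__ == "__main__":
        try:
            asyncio.run(main())                # Ctrl+C surfaces here
        except KeyboardInterrupt:
            print("shut down")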
Hi all
My app runs an HTTP server using asyncio. A lot of the code dates back
to Python 3.4, and I am trying to bring it up to date. There is one
aspect I do not understand.
The 'old' way looks like this -
import asyncio
def main():
    loop = asyncio.get_event_loop()
    se…
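The old-style setup being described typically continued along the lines of the sketch below; the handler, host and port are placeholders, not the code from the message.

    import asyncio

    async def handle_client(reader, writer):
        writer.close()

    def main():
        loop = asyncio.get_event_loop()
        coro = asyncio.start_server(handle_client, "127.0.0.1", 8080)
        server = loop.run_until_complete(coro)
        try:
            loop.run_forever()      # blocks until stopped
        except KeyboardInterrupt:
            pass                    # this is where Ctrl+C is caught
        finally:
            server.close()
            loop.run_until_complete(server.wait_closed())
            loop.close()

    main()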
On 2020-02-28 1:37 AM, rmli...@riseup.net wrote:
> What resources are you trying to conserve?
>
> If you want to try conserving time, you shouldn't have to worry about
> starting too many background tasks. That's because asyncio code was
> designed to be extremely time efficient at handling large …
What resources are you trying to conserve?
If you want to try conserving time, you shouldn't have to worry about
starting too many background tasks. That's because asyncio code was
designed to be extremely time efficient at handling large numbers of
concurrent async tasks.
For your application …
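One quick way to see that claim in practice: spawn thousands of tasks that each just wait on their own queue. The count below is arbitrary, chosen purely to illustrate that idle tasks are cheap.

    import asyncio

    async def waiter(q):
        # Each task sleeps on its own queue until it receives a sentinel.
        while await q.get() is not None:
            pass

    async def main():
        queues = [asyncio.Queue() for _ in range(10_000)]
        tasks = [asyncio.create_task(waiter(q)) for q in queues]
        await asyncio.sleep(1)      # 10,000 idle tasks barely use any CPU
        for q in queues:
            q.put_nowait(None)      # tell every task to finish
        await asyncio.gather(*tasks)

    asyncio.run(main())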
On 2020-02-21 11:13 PM, Greg Ewing wrote:
On 21/02/20 7:59 pm, Frank Millman wrote:
My first attempt was to create a background task for each session
which runs for the life-time of the session, and 'awaits' its queue.
It works, but I was concerned about having a lot of background tasks
active …
On 21/02/20 7:59 pm, Frank Millman wrote:
My first attempt was to create a background task for each session which
runs for the life-time of the session, and 'awaits' its queue. It works,
but I was concerned about having a lot of background tasks active at the
same time.
The whole point of asyncio …
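A sketch of the "one background task per session" design being discussed; the Session class and its methods are invented here for illustration, not taken from the thread.

    import asyncio

    class Session:
        def __init__(self, session_id):
            self.session_id = session_id
            self.queue = asyncio.Queue()
            # One long-lived task per session, awaiting that session's queue.
            self.task = asyncio.create_task(self._run())

        async def _run(self):
            while True:
                msg = await self.queue.get()
                if msg is None:             # sentinel: session closed
                    break
                await self.handle(msg)

        async def handle(self, msg):
            print(f"session {self.session_id}: {msg}")

        async def close(self):
            await self.queue.put(None)
            await self.task

    async def main():
        session = Session("abc123")
        await session.queue.put("hello")
        await session.close()

    asyncio.run(main())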
Hi all
I use asyncio in my project, and it works very well without my having to
understand what goes on under the hood. It is a multi-user client/server
system, and I want it to scale to many concurrent users. I have a
situation where I have to decide between two approaches, and I want to
choose …
> -----Original Message-----
> From: Python-list <bounces+jcasale=activenetwerx@python.org> On Behalf Of Simon Connah
> Sent: Thursday, March 14, 2019 3:03 AM
> To: Python
> Subject: asyncio Question
>
> Hi,
>
> Hopefully this isn't a stupid question …
Hi,
Hopefully this isn't a stupid question. For the record I am using Python
3.7 on Ubuntu Linux.
I've decided to use asyncio to write a TCP network server using Streams
and asyncio.start_server(). I can handle that part of it without many
problems as the documentation is pretty good. I have …
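For reference, a minimal echo server along the lines described, using Streams and asyncio.start_server(); the host, port and line-based protocol are assumptions, not Simon's code.

    import asyncio

    async def handle_connection(reader, writer):
        addr = writer.get_extra_info("peername")
        while True:
            line = await reader.readline()      # b"" on EOF
            if not line:
                break
            writer.write(line)                  # echo the line back
            await writer.drain()
        print(f"{addr} disconnected")
        writer.close()
        await writer.wait_closed()

    async def main():
        server = await asyncio.start_server(handle_connection, "0.0.0.0", 8888)
        async with server:
            await server.serve_forever()

    asyncio.run(main())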
"Ian Kelly" wrote in message
news:CALwzid=vdczAH18mHKaL7ryvDUB=7_y-JVUrTkRZ=gkz66p...@mail.gmail.com...
On Tue, Dec 13, 2016 at 6:15 AM, Frank Millman wrote:
> The client uses AJAX to send messages to the server. It sends the
> message
> and continues processing, while a background task waits
On Tue, Dec 13, 2016 at 6:15 AM, Frank Millman wrote:
> The client uses AJAX to send messages to the server. It sends the message
> and continues processing, while a background task waits for the response and
> handles it appropriately. As a result, the client can send a second message
> before receiving …
Hi all
I had a problem with asyncio - not a programming problem, but one with
organising my code to achieve a given result.
I have come up with a solution, but thought I would mention it here to see
if there is a better approach.
I am using asyncio.start_server() to run a simple HTTP server …
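Since start_server() only provides a raw TCP stream, a "simple HTTP server" on top of it means reading the request and writing the response by hand. A rough sketch of that idea; it is not Frank's code, and it is far from a complete HTTP implementation.

    import asyncio

    async def handle_http(reader, writer):
        request_line = await reader.readline()
        print("request:", request_line.decode().rstrip())
        # Skip the headers; a real server would parse them.
        while await reader.readline() not in (b"\r\n", b"\n", b""):
            pass
        body = b"Hello from asyncio\n"
        writer.write(
            b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: text/plain\r\n"
            b"Content-Length: " + str(len(body)).encode() + b"\r\n"
            b"Connection: close\r\n"
            b"\r\n" + body
        )
        await writer.drain()
        writer.close()
        await writer.wait_closed()

    async def main():
        server = await asyncio.start_server(handle_http, "127.0.0.1", 8080)
        async with server:
            await server.serve_forever()

    asyncio.run(main())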
I have a portion of code I need to speed up. There are 3 API calls to an
external system: the first enumerates a large collection of objects, which I
then loop through, performing two additional API calls for each. The first
call is instant; the second and third per object are very slow. Currently
aft…
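The usual asyncio answer to that shape of problem is to fan the per-object calls out concurrently, with a semaphore to cap how many are in flight at once. The API functions below are placeholders standing in for the real calls; if the real client library is synchronous, the same fan-out can be done with loop.run_in_executor() and a thread pool instead.

    import asyncio

    # Placeholders for the three external API calls described above.
    async def list_objects():
        return [f"obj-{i}" for i in range(100)]     # the fast enumeration call

    async def slow_call_a(obj):
        await asyncio.sleep(1)
        return f"a({obj})"

    async def slow_call_b(obj):
        await asyncio.sleep(1)
        return f"b({obj})"

    async def fetch(obj, limit):
        async with limit:                           # cap in-flight requests
            # The two slow calls for one object can also run concurrently.
            return await asyncio.gather(slow_call_a(obj), slow_call_b(obj))

    async def main():
        objects = await list_objects()
        limit = asyncio.Semaphore(20)
        results = await asyncio.gather(*(fetch(obj, limit) for obj in objects))
        print(len(results), "objects processed")

    asyncio.run(main())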