Channels 2: Consumer lifecycle when run in a worker process

2018-05-02 Thread Alexander Prokhorov
Dear Andrew,

I would like to ask a couple of questions about the lifecycle of consumers 
running in Channels workers and serving custom channels.

Consider a consumer:

# myconsumer.py
import channels.consumer
import channels.exceptions

class MyConsumer(channels.consumer.AsyncConsumer):
    async def wakeup(self, message):
        await some_process()
        raise channels.exceptions.StopConsumer()

which I "register" to process messages in the channel `my_channel`:

# routing.py
import channels.routing

application = channels.routing.ProtocolTypeRouter({
    'channel': channels.routing.ChannelNameRouter({
        'my_channel': MyConsumer,
    }),
})

and eventually I run a designated Channels worker to process `my_channel` 
messages:

./manage.py runworker my_channel

So the questions are:

   - Will `MyConsumer` receive new `wakeup` messages while awaiting 
   `some_process`?
   - When do I need to raise `StopConsumer`? Can I do it after processing 
   each `wakeup` message (as in the code above)? What will happen to all 
   the pending messages in that case?

Actually, I do not raise `StopConsumer` in the implementation I currently 
have, but this leads to an issue with tests. In tests I need to somehow 
wait until all workers finish processing their messages. I tried calling 
`channels.testing.ApplicationCommunicator.wait()`, but as far as I can see 
from the code, it waits for the application/consumer to finish, i.e. to 
raise the `StopConsumer` exception. Perhaps you can share some 
recommendations. Thanks in advance.

Best regards,
Alexander.

-- 
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to django-users+unsubscr...@googlegroups.com.
To post to this group, send email to django-users@googlegroups.com.
Visit this group at https://groups.google.com/group/django-users.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/django-users/00e6398b-a71f-4509-a95b-3ced88b26ee0%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.


Re: Channels 2: Consumer lifecycle when run in a worker process

2018-05-02 Thread Alexander Prokhorov
Andrew, thank you for the quick response!

Unfortunately, I see something different. If I raise `StopConsumer` after 
processing a single message, the next one is lost. From the code at 
https://github.com/django/asgiref/blob/master/asgiref/server.py#L75 I see 
that `get_or_create_application_instance` returns the existing application 
instance, so when a message arrives it is put into the queue of the 
existing application. If I raise `StopConsumer`, this application gets 
killed and the message is lost.

Another observation: while a message handler awaits something (like `await 
some_process()` in my snippet), the application instance does not process 
new messages. Only when one message handler function exits does the next 
one start.

Actually, I was offloading message processing to a thread pool (using 
`sync_to_async`) and trying to limit the number of messages being processed 
at the same time. I hoped that if I awaited some coroutine, the worker 
would stop accepting messages, so other workers could process them until 
this one got free again. Can you give me a clue how to achieve this 
behaviour?


On Wednesday, May 2, 2018 at 18:39:01 UTC+3, Andrew Godwin wrote:
>
>
> - Will `MyConsumer` receive new `wakeup` messages while awaiting 
> `some_process`?
>
> Yes. The worker server runs as many application instances as there are 
> messages coming in, even though they all have the same scope. You can see 
> the main listening loop here: 
> https://github.com/django/channels/blob/master/channels/worker.py#L32
>
> - When do I need to raise `StopConsumer`? I can do it after each 
> processing of `wakeup` message (like in the code above) is that 
> correct? What will happen with all the `pending` messages in such case?
>
> You need to raise it when the specific application instance you have is 
> completed - because there's a different instance for each message, that 
> means "raise when you've finished processing the message". Nothing happens 
> to other messages as they're being handled by different instances.
>
> Andrew

Re: Channels 2: Consumer lifecycle when run in a worker process

2018-05-02 Thread Alexander Prokhorov

Indeed, that is exactly what I am doing - running the processing in the 
background with

task = asyncio.ensure_future(database_sync_to_async(self._process)(message))

and then keeping track of the running tasks. Actually, I am quite happy 
with this except for one thing: I would like to limit the number of 
messages processed at the same time by a single worker. The logic is 
straightforward - once I start a new task, I check whether the "tasks per 
worker" limit has been reached, and if so, I just invoke

await asyncio.wait(self._tasks, return_when=asyncio.FIRST_COMPLETED)

so that I block the consumer's message handler until some task finishes. I 
have not experimented, but from the worker's code 
<https://github.com/django/channels/blob/4500a4252c8459eebe8922533a1a3dd04f1c6e9d/channels/worker.py#L32>
I conclude that the worker will extract messages from the channel anyway 
and put them into the application's queue, which (according to this 
<https://github.com/django/asgiref/blob/2e92fe245620332a6e57d2e3d1342839758e64b8/asgiref/server.py#L86>)
has unlimited size. And that is what bothers me the most. My logic is 
simple: if a worker has already reached the "tasks per worker" limit, I do 
not want it to extract messages from the channel, because there is probably 
another worker process willing to process them. Frankly, I do not 
understand how to achieve this... sorry for bothering you, but perhaps you 
have some bright idea?

Anyway, thank you for the explanation you have already given me; it helps, 
and it is always a pleasure to chat with you :-)
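The "tasks per worker" logic described above can be sketched with plain 
asyncio. Everything here is illustrative (the `TaskPool` class, the limit 
value, the `process` coroutine); it only demonstrates the 
`asyncio.wait(..., FIRST_COMPLETED)` blocking pattern, not Channels itself.

```python
# Sketch: keep a set of running tasks; before starting one past the
# limit, block on FIRST_COMPLETED until a slot frees up.
import asyncio

LIMIT = 3

class TaskPool:
    def __init__(self, limit):
        self.limit = limit
        self.tasks = set()

    async def submit(self, coro):
        if len(self.tasks) >= self.limit:
            # Block until at least one running task finishes.
            done, pending = await asyncio.wait(
                self.tasks, return_when=asyncio.FIRST_COMPLETED
            )
            self.tasks = pending
        self.tasks.add(asyncio.ensure_future(coro))

    async def drain(self):
        # Wait for everything still running to complete.
        if self.tasks:
            await asyncio.wait(self.tasks)

processed = []

async def process(message):
    await asyncio.sleep(0.01)   # stand-in for real processing
    processed.append(message)

async def main():
    pool = TaskPool(LIMIT)
    for i in range(10):
        await pool.submit(process(i))
    await pool.drain()

asyncio.run(main())
```

As the thread notes, this blocks the consumer's handler but cannot stop 
the worker from continuing to fetch messages into the application's 
unbounded input queue.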

On Wednesday, May 2, 2018 at 22:35:08 UTC+3, Andrew Godwin wrote:
>
> Ah, my apologies - you are entirely right, the scope is the same so it 
> will re-use a single existing instance, which means that it will process 
> messages synchronously and lose them, as you suggested.
>
> Using sync_to_async won't help as, while it runs in a threadpool, it also 
> blocks the coroutine until that thread completes.
>
> Without modifying the underlying worker implementation, the best way to 
> process things in parallel would be to spin off things into their own 
> coroutines within your handler - either manually, using 
> EventLoop.create_task, or I guess you could slew it out into 
> different-named channels.
>
> Andrew

Re: Channels 2: Consumer lifecycle when run in a worker process

2018-05-02 Thread Alexander Prokhorov
Thank you very much!

On Thursday, May 3, 2018 at 1:07:22 UTC+3, Andrew Godwin wrote:
>
> At the level of abstraction of the Worker, you can't prevent it pulling 
> more messages off the queue - if you want that level of control, you would 
> have to subclass it and change the logic yourself, I imagine.
>
> Andrew
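A rough stdlib-only sketch of what such a subclass's receive loop could 
look like: acquire a processing slot before pulling the next message, so 
unfetched messages stay in the channel layer for other worker processes. 
`FakeLayer` and `BoundedWorker` are invented stand-ins for illustration, 
not real Channels classes.

```python
# Sketch of a bounded pull loop: receive() is only called when a
# processing slot is free, so backlog stays in the (fake) layer.
import asyncio

class FakeLayer:
    """Stand-in for a channel layer: a pre-filled async queue."""
    def __init__(self, messages):
        self.queue = asyncio.Queue()
        for m in messages:
            self.queue.put_nowait(m)

    async def receive(self, channel):
        return await self.queue.get()

class BoundedWorker:
    def __init__(self, layer, channel, limit):
        self.layer = layer
        self.channel = channel
        self.semaphore = asyncio.Semaphore(limit)
        self.tasks = []
        self.handled = []

    async def handle_one(self, message):
        try:
            await asyncio.sleep(0.01)   # stand-in for real processing
            self.handled.append(message)
        finally:
            self.semaphore.release()     # free the slot

    async def run(self, total):
        for _ in range(total):
            # Do not pull the next message until a slot is free.
            await self.semaphore.acquire()
            message = await self.layer.receive(self.channel)
            self.tasks.append(asyncio.ensure_future(self.handle_one(message)))
        await asyncio.gather(*self.tasks)

async def main():
    worker = BoundedWorker(FakeLayer(range(10)), "my_channel", limit=2)
    await worker.run(10)
    return worker

worker = asyncio.run(main())
```

In a real subclass, this bounded loop would replace the unbounded 
dispatch in the worker's `handle()` method, as Andrew suggests.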

Re: Django-channels and JSON-RPC

2017-01-22 Thread Alexander Prokhorov
If you are going to implement JSON-RPC on top of Channels, I would be 
happy to participate; I suppose we will start doing this in a few weeks 
anyway.

On Friday, January 20, 2017 at 13:36:49 UTC+3, Fabien Millerand wrote:
>
> Thanks a lot for your answer Andrew.
>
> On a side note, would you be related to Mike Godwin? I just drew some 
> moustaches on Trump's face 5 minutes before seeing your message... It is 
> disturbing...
>
>
>
>
> On Thursday, January 19, 2017 at 19:27:10 UTC+1, Andrew Godwin wrote:
>>
>> I haven't seen anything like that personally, but I also don't see all of 
>> the Channels stuff going on, so maybe there is one.
>>
>> It would be relatively easy to implement as a single class-based consumer 
>> that dispatches to RPC handlers based on method name, though, as it matches 
>> the consumer pattern very well.
>>
>> Andrew
>>
>> On Thu, Jan 19, 2017 at 7:34 AM, Fabien Millerand wrote:
>>
>>> Hi everyone,
>>>
>>> I am looking to implement a websocket server based on Django using 
>>> JSON-RPC protocol.
>>>
>>> I have been looking around for a pre-made solution without success. I am 
>>> also a newbie in Django so I am a little bit lost...
>>>
>>> Did anyone try to develop something like that?
>>>
>>>



Re: Django-channels and JSON-RPC

2017-01-24 Thread Alexander Prokhorov
My GitHub name is "prokher". I would be happy to help.

On Tuesday, January 24, 2017 at 11:00:06 UTC+3, Fabien Millerand wrote:
>
> Hi,
>
> I will start implementing something this week (starting today).
> I will let you know how things go.
> Alexander, I would be happy to share my work and get some feedback/help on 
> this project. Let me know your GitHub username and we can collaborate.
>
> Cheers
> Fab



Re: Django-channels and JSON-RPC

2017-01-25 Thread Alexander Prokhorov
Colleagues,

you are really fast :) How can I help you? For our project we will 
definitely need a JavaScript client. Quick googling led me to 
https://github.com/JsCommunity/jsonrpc-websocket-client it does not look 
mature, but such client is quite easy to implement. Do you have plans to 
include simple JavaScript client to the package? If so I could try to start 
doing one right now.

On Wednesday, January 25, 2017 at 11:57:35 UTC+3, Fabien Millerand wrote:
>
> Fair enough. I understand that in distributed system. But maybe you should 
> add a note about that, as if the whole system is not distributed over 
> network(s), it is highly unlikely to lose frames :D
>
>
>
> On 25 Jan 2017, at 09:08, Andrew Godwin wrote:
>
> Yes, it's a bit alarmist if you don't come from the background of writing 
> distributed systems. I just don't like to hide the truth one bit!
>
> All your software and hardware can fail in myriad ways; I have a talk I 
> need to give about it at some point. Knowing how it fails is half the 
> battle!
>
> Andrew
>
> On Wed, Jan 25, 2017 at 12:06 AM, Fabien Millerand wrote:
>
>> Ok, I start to understand now.
>> To be frank the docs are a bit alarming :) 
>>
>>
>>
>> On Wednesday, January 25, 2017 at 08:47:06 UTC+1, Andrew Godwin wrote:
>>>
>>>
>>>> I am not sure to understand. In which case can there be 
>>>> messages/frames lost?! Where does that happen? Between the server 
>>>> interface and the Django layer? I would need to know more about 
>>>> that... Otherwise I might need to move to uWSGI or something. JSON-RPC 
>>>> in itself doesn't implement a timeout, although the javascript client 
>>>> had better have one...
>>>
>>> It simply means that it's possible that you might lose an incoming 
>>> frame. This is also true of implementing it in uWSGI (the process handling 
>>> the socket might get OOM killed, or the server might die, etc.)
>>>
>>> It's not a normal case, it's just that if something super bad happens, 
>>> the resulting handling is to drop a message rather than play it twice. Most 
>>> systems I know of that handle websockets do this.
>>>
>>> Andrew 
>>>
>>



Re: Django-channels and JSON-RPC

2017-01-26 Thread Alexander Prokhorov
Dear Colleagues,

I've made some fixes in the code (most of them concern Python 3 
compatibility). I also added a test showing a problem with some kind of 
name clash.

On Friday, January 27, 2017 at 1:07:38 UTC+3, Fabien Millerand wrote:
>
> Yes, that was my thought as well...
>
> Will do and let you know. It would be good to add a link in your docs as 
> a side note...
>
> If you get any feedback let me know...
>
> On 26 Jan 2017, at 22:38, Andrew Godwin wrote:
>
>> A quick read through and it looks roughly how I expect; I'm not an expert 
>> in JSON-RPC though so I'd want some other people to chime in.
>>
>> As for releasing it on PyPI, I think that's the best way; I don't want to 
>> roll something like this into Channels directly. I would suggest you turn 
>> it from a Django app into a simple Python module though, so you can just do:
>>
>>
>> from channels_jsonrpc import JsonRpcWebsocketConsumer
>>
>> class MyConsumer(JsonRpcWebsocketConsumer):
>>
>>
>>
>> In your own consumers file.
>>
>> Andrew
>>
>> On Thu, Jan 26, 2017 at 12:14 PM, Fabien Millerand wrote:
>>
>>> Andrew,
>>>
>>> I have finished to develop what I called the JsonRpcWebsocketConsumer:
>>>
>>>
>>> https://github.com/millerf/django-channels-jsonrpc/tree/master/django_channels_jsonrpc/django_channels_jsonrpc
>>>
>>> I was thinking of creating a PyPI package; there is a little bit more 
>>> work to be done for that. But if you want it for your next release it is 
>>> pretty much standalone. There is an example provided and plenty of tests.
>>>
>>> Let me know what you guys think, and if you see anything to be 
>>> modified/added.
>>>
>>> Cheers.
>>>

Attach consumer to dynamically created single-reader channel (Django Channels)

2017-03-19 Thread Alexander Prokhorov
Dear Colleagues,

I have been playing with Channels for some time already.

Now I am trying to establish a "sticky" channel to one of the available 
workers. I thought I could create a single-reader channel from a consumer 
and somehow fix the routing so that this particular worker handles that 
single-reader channel. I can use

channels.channel_layers[channels.DEFAULT_CHANNEL_LAYER].new_channel(
    "my_exclusive_channel?")

in consumer code to create a single-reader channel, but I cannot figure 
out how to set up routing properly in such a case. I tried to implement a 
router object from scratch, but as far as I can see, its method 
`channel_names` is invoked only once per worker. In other words, the 
problem reduces to the question of whether it is possible to attach a 
consumer to a dynamically created single-reader channel.




Re: Attach consumer to dynamically created single-reader channel (Django Channels)

2017-04-25 Thread Alexander Prokhorov
Dear Andrew,

thank you for the advice. I am now starting to implement an ASGI worker 
based on the one included in Channels. I would like to look at an example 
of how to create and use single-reader channels properly, so I went 
through the Channels, asgi_redis and Daphne code bases and did not find 
any place where such channels are created. In particular, I hoped to find 
the place where Daphne creates `http.request.body_channel` to handle large 
HTTP requests. To my surprise, I did not find any mention of 
`body_channel` in the Daphne code except for a single test, which simply 
checks the message for the optional `body_channel` field. Am I missing 
something, or does Daphne simply not implement this concept and therefore 
cannot handle large requests by chunking them sequentially?

On Sunday, March 19, 2017 at 21:18:07 UTC+3, Andrew Godwin wrote:
>
> Workers can't dynamically change their channels during runtime, but you 
> can tell workers to only run specific channels in one of two ways:
>
>  - Have different routing configurations and different settings files that 
> point to each for the two worker groups
>  - Use the --only-channels and --exclude-channels arguments to runworker 
> to restrict which channels it will process.
>
> You can still only do this once at the start of the application, though. 
> There's no way to dynamically vary what a worker wants at runtime because 
> of how the Channels framework is written and integrated into Django; you'd 
> have to do your own ASGI worker class from scratch if you wanted that.
>
> Andrew
