valdemar pavesi added the comment:
thanks Guido and Yury
I am new to the Python world. I was working with automation tests, software
implemented in Delphi back in 199x.
This year I got a Python certification from the University of Texas at Arlington
through edX,
and I have already written 4 projects in Python 3.
valdemar pavesi added the comment:
thanks Guido
I will keep working with it.
I am not able to debug between the socket and the asyncio UDP read.
There is no bottleneck on CPU, memory, or the network card.
I cannot write debug output under this heavy UDP load;
if I decrease the UDP rate per second then this problem does not happen.
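One low-overhead way to see whether datagrams even reach datagram_received() is to count them there and log the running total periodically with loop.call_later(), instead of printing per packet. A minimal sketch, assuming a protocol class of my own naming (ReportProtocol) and a one-second logging interval:

import asyncio

class ReportProtocol(asyncio.DatagramProtocol):
    """Counts datagrams and logs the total once per second, so the
    receive path is not slowed down by per-packet debug output."""

    def __init__(self, loop):
        self.loop = loop
        self.received = 0
        self.loop.call_later(1.0, self._log_count)

    def datagram_received(self, data, addr):
        self.received += 1  # cheap bookkeeping only

    def _log_count(self):
        print('datagrams received so far:', self.received)
        self.loop.call_later(1.0, self._log_count)

Comparing this counter against the dumpcap count should show whether packets are lost before datagram_received() is ever called.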
valdemar pavesi added the comment:
hi,
I made a change, removing the queue and calling the coroutine directly, and now
the UDP loss is bigger.
def datagram_received(self, data, addr):
    asyncio.ensure_future(process_report(data))

@asyncio.coroutine
def process_report(data):
    ...
tcpdump got
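For context, a self-contained version of this variant might look like the sketch below; the class name, address, and port are my assumptions, and the real body of process_report() is not shown in the report:

import asyncio

@asyncio.coroutine
def process_report(data):
    pass  # placeholder: the real report processing is not shown here

class ReportProtocol(asyncio.DatagramProtocol):
    def datagram_received(self, data, addr):
        # schedules one Task per datagram; at this packet rate that adds
        # scheduling work on the same loop that must keep draining the socket
        asyncio.ensure_future(process_report(data))

loop = asyncio.get_event_loop()
listen = loop.create_datagram_endpoint(ReportProtocol,
                                       local_addr=('0.0.0.0', 9999))  # hypothetical port
transport, protocol = loop.run_until_complete(listen)
loop.run_forever()

If the loop spends more time creating and switching tasks than reading the socket, the kernel receive buffer can overflow silently, which would be consistent with the loss growing after this change.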
valdemar pavesi added the comment:
I do understand the possibility of losing UDP packets,
and I am monitoring all the machines involved.
I am not losing on the network; I am losing between our network-card dumpcap and
the UDP read from the socket.
Is it possible that the read will be blocked and not able to read from the socket?
valdemar pavesi added the comment:
I am not getting any pause, or any message about buffer full.
def pause_reading(self):
    print('pause_reading')

def resume_reading(self):
    print('resume_reading')
and I cannot find a way to increase the receive buffer.
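For what it is worth, pause_reading()/resume_reading() are methods of stream read transports, not callbacks that asyncio invokes on a DatagramProtocol, so these prints will never fire; when the socket receive buffer fills up, the kernel just drops datagrams silently. A minimal sketch of enlarging that buffer, assuming the transport returned by create_datagram_endpoint() is in scope (the 4 MB value is an arbitrary example, and the kernel may cap it, e.g. via net.core.rmem_max on Linux):

import socket

sock = transport.get_extra_info('socket')  # the underlying UDP socket
print('SO_RCVBUF before:', sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 4 * 1024 * 1024)
print('SO_RCVBUF after:', sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))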
valdemar pavesi added the comment:
thanks Yury,
I think we could lose packets inside the network, but we cannot lose them inside
our application.
regards!
Valdemar
New submission from valdemar pavesi:
hello,
I am using asyncio to handle 165 UDP packets per second.
Everything received by datagram_received() is put into a queue.
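A minimal sketch of that queue-based setup, with the class name, consumer coroutine, address, and port all being my assumptions rather than the actual code:

import asyncio

class ReportProtocol(asyncio.DatagramProtocol):
    def __init__(self, queue):
        self.queue = queue

    def datagram_received(self, data, addr):
        # keep this callback cheap: just hand the datagram to the consumer
        self.queue.put_nowait((data, addr))

@asyncio.coroutine
def consume(queue):
    while True:
        data, addr = yield from queue.get()
        # the real report handling would go here

loop = asyncio.get_event_loop()
queue = asyncio.Queue()
listen = loop.create_datagram_endpoint(lambda: ReportProtocol(queue),
                                       local_addr=('0.0.0.0', 9999))  # hypothetical port
transport, protocol = loop.run_until_complete(listen)
loop.create_task(consume(queue))
loop.run_forever()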
+++
124605 UDP packets sent from the client,
and received by the server network:
dumpcap (filter "port 5 and len