When testing, how many TCP connections are you actually creating? Have you 
checked? If you're using a single shared client stub in your load-tester 
app, then you're probably just creating many HTTP/2 streams over a single 
HTTP/2 connection. If you have a load balancer such as AWS ALB in between, 
it also enforces a maximum of 128 concurrent HTTP/2 streams per HTTP/2 
connection.
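
For illustration only, here is a minimal Go sketch of the alternative: open 
several channels in the load tester so that each one gets its own TCP/HTTP2 
connection and its own per-connection stream budget. The target address, the 
channel count, and the generated stub mentioned in the comments are 
placeholders for your own setup, not anything from your message:

    package main

    import (
        "log"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
    )

    func main() {
        // Placeholder values: e.g. 10 channels x ~100 streams each
        // covers roughly 1000 concurrent client streams.
        const numChannels = 10
        target := "localhost:50051" // hypothetical plaintext server address

        conns := make([]*grpc.ClientConn, 0, numChannels)
        for i := 0; i < numChannels; i++ {
            // Each grpc.Dial call creates an independent ClientConn, and
            // therefore its own TCP/HTTP2 connection to the target.
            conn, err := grpc.Dial(target,
                grpc.WithTransportCredentials(insecure.NewCredentials()))
            if err != nil {
                log.Fatalf("dial %d: %v", i, err)
            }
            defer conn.Close()
            conns = append(conns, conn)
        }

        // Spread simulated clients across the channels round-robin, e.g.:
        //   stub := pb.NewYourServiceClient(conns[clientID%numChannels])
        //   stream, err := stub.StreamData(ctx)
        // (pb.NewYourServiceClient / StreamData stand in for your generated code.)
        log.Printf("opened %d independent HTTP/2 connections", len(conns))
    }

With a single shared stub, all of those simulated clients would instead be 
multiplexed as streams over one connection and hit the per-connection limit.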

Best,
Edvard

On Wednesday, October 18, 2023 at 7:57:15 PM UTC+3 pallav parikh wrote:

> Hi,
>
> I have a requirement to stream data from clients to a server, and all the 
> streams are long-lived sessions. Currently we are using Crossbar as a 
> router and WebSockets for streaming the data, but now we are looking to 
> change the architecture. I am very impressed by the efficiency gRPC 
> brings, but in our use case we will have 1000 clients streaming data to 
> the server concurrently, and when I tried out that scenario, gRPC stopped 
> accepting new connection requests after a certain number of concurrent 
> streams (approximately 150-200). So is this a gRPC limitation, and if 
> not, how can I overcome this issue?
>
> Thank you.
>
