Hi,

We are currently working on a project that involves a C++ gRPC server, and 
we have observed that it appears to consume considerably more CPU than an 
equivalent Java gRPC server under similar load conditions.

Our setup involves multi-threaded C++ applications where performance 
measurement and logging are critical, so this CPU overhead is a real 
concern. Even after ensuring that our code and configuration are 
optimized, we still see this disparity. We wanted to reach out to the 
community to understand:

   1. *Is this a known issue* or limitation in C++ gRPC?
   2. Are there any *specific optimizations or tuning parameters* for the 
   C++ gRPC server that could help reduce CPU usage?
   3. Has anyone in the community faced similar issues, and if so, how did 
   you mitigate it?
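For context on question 2, below is a minimal sketch of the kind of tuning we have been experimenting with on the synchronous C++ server: capping the thread pool via `grpc::ResourceQuota` and adjusting the sync-server polling options. The listening address and all numeric values here are illustrative assumptions, not recommendations.

```cpp
#include <memory>

#include <grpcpp/grpcpp.h>
#include <grpcpp/resource_quota.h>

// Sketch: build a sync gRPC server with tuning knobs that are commonly
// adjusted to reduce CPU usage. Values below are placeholders.
void BuildTunedServer(grpc::Service* service) {
  grpc::ServerBuilder builder;
  builder.AddListeningPort("0.0.0.0:50051",
                           grpc::InsecureServerCredentials());
  builder.RegisterService(service);

  // Cap the number of threads gRPC may spawn (illustrative limit).
  grpc::ResourceQuota quota("server_quota");
  quota.SetMaxThreads(32);
  builder.SetResourceQuota(quota);

  // Sync-server polling options: fewer pollers / completion queues can
  // lower idle and per-request CPU at some cost in tail latency.
  builder.SetSyncServerOption(
      grpc::ServerBuilder::SyncServerOption::MIN_POLLERS, 1);
  builder.SetSyncServerOption(
      grpc::ServerBuilder::SyncServerOption::MAX_POLLERS, 2);
  builder.SetSyncServerOption(
      grpc::ServerBuilder::SyncServerOption::NUM_CQS, 1);

  std::unique_ptr<grpc::Server> server = builder.BuildAndStart();
  server->Wait();
}
```

We would be especially interested to hear whether these knobs, or moving to the callback/async API, have made a measurable difference for others.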

Any insights, suggestions, or guidance would be greatly appreciated.
Thanks & Regards,
Kritika Goel

-- 
You received this message because you are subscribed to the Google Groups 
"grpc.io" group.
To view this discussion visit 
https://groups.google.com/d/msgid/grpc-io/d3859aae-154d-4afa-ac7c-a3e09b9ba5bbn%40googlegroups.com.