Dear Yogesh,

To get the best answers from community members, you need to provide complete
information such as the PostgreSQL version, server/hardware details, etc., so
that it helps members assist you in the right way.

http://wiki.postgresql.org/wiki/Guide_to_reporting_problems
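For the pooling questions below, PostgreSQL itself does not queue connections
beyond max_connections; an external pooler such as PgBouncer can cap the
backend processes and queue excess clients. A minimal sketch of a
pgbouncer.ini (the database name, port, and file paths here are placeholders
to adapt to your setup):

```ini
[databases]
; clients connect to PgBouncer, which forwards to the real server
mydb = host=127.0.0.1 port=5432 dbname=mydb

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
; reuse a server connection per transaction
pool_mode = transaction
; accept up to 200 clients, but...
max_client_conn = 200
; ...open at most 50 backend connections; extra clients wait in a queue
default_pool_size = 50
```

With this, your 100 parallel processes would connect to port 6432, and
PostgreSQL would see at most 50 backends; the rest are queued by PgBouncer
rather than failing.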

---
Regards,
Raghavendra
EnterpriseDB Corporation
Blog: http://raghavt.blogspot.com/



On Tue, Oct 18, 2011 at 7:27 PM, Deshpande, Yogesh Sadashiv (STSD-Openview)
<yogesh-sadashiv.deshpa...@hp.com> wrote:

>  Hello,
>
> We have a setup in which around 100 processes run in parallel every 5
> minutes, and each of them opens a connection to the database. We are
> observing that for each connection, PostgreSQL also creates one
> subprocess. We have set max_connections to 100, so the number of
> subprocesses in the system is close to 200 every 5 minutes, and because
> of this we are seeing very high CPU usage. We need the following
> information:
>
> 1. Is there any configuration we can set that would pool the connection
> requests rather than failing with a connection-limit-exceeded error?
>
> 2. Is there any configuration we can set that would limit the
> subprocesses to some value, say 50, with any further connection
> requests getting queued?
>
> Basically, we want to limit the number of processes so that the client
> code doesn't have to retry on unavailability of connections or
> subprocesses; can PostgreSQL take care of the queuing?
>
> Thanks,
>
> Yogesh
>