On 4/7/20 2:23 PM, Sándor Daku wrote:
On Tue, 7 Apr 2020 at 21:52, David Gauthier <davegauthie...@gmail.com> wrote:
After looking at some of the factors that can affect this, I think
it may be important to know that most of the connections will be
almost idle (in terms of interacting with the DB). The "users"
are perl/dbi scripts which connect to the DB and spend the vast
majority of their time doing things other than interacting with the
DB. So a connection is consumed, but it's not really working very
hard with the DB per se. I am cleaning up some of that code by
strategically connecting/disconnecting only when a DB
interaction is required. But for my edification, is it roughly
true that 2 connections working with the DB 100% of the time is
equivalent to 20 connections @ 10%, or 200 connections @ 1% (if you
know what I mean)?
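
A minimal sketch of that connect-only-when-needed pattern, assuming
DBD::Pg and a made-up DSN and "results" table (substitute your own),
might look like this:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection parameters; use your own host/db/credentials.
    my $dsn = "dbi:Pg:dbname=mydb;host=dbhost;port=5432";

    sub log_result_to_db {
        my ($value) = @_;
        # Connect only for the duration of the DB interaction...
        my $dbh = DBI->connect($dsn, 'appuser', 'secret',
                               { RaiseError => 1, AutoCommit => 1 });
        $dbh->do('INSERT INTO results (value) VALUES (?)', undef, $value);
        # ...and release the connection as soon as the work is done.
        $dbh->disconnect;
    }

    # The script spends most of its time elsewhere and only holds a
    # server connection while log_result_to_db() is running.
    log_result_to_db(42);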
Hi,
Every open connection consumes a bit of resources, which is not a big
deal if you keep open a few more connections than you strictly
need. However, when you keep a few hundred idle connections, those
resources add up quickly. So don't do that if you can avoid it.
Likewise, establishing a new connection is a resource-costly process,
so avoid doing that constantly as well, if possible.
Long story short, if those connections don't use many different users,
then (as others have already suggested) connection pooling will be the
best solution.
Regards,
Sándor
And from my experience, pg_bouncer is very easy to include in your
stack. (I haven't tried pg_pool.)
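
For what it's worth, a minimal pg_bouncer setup for this kind of
mostly-idle workload might look something like the sketch below; the
database name, addresses, and pool sizes are placeholders, not tuned
recommendations:

    ; pgbouncer.ini (sketch only)
    [databases]
    ; Clients connect to "mydb" on pgbouncer; it forwards to the real server.
    mydb = host=127.0.0.1 port=5432 dbname=mydb

    [pgbouncer]
    listen_addr = 127.0.0.1
    listen_port = 6432
    auth_type = md5
    auth_file = /etc/pgbouncer/userlist.txt
    ; transaction pooling lets many mostly-idle clients share a small
    ; number of real server connections
    pool_mode = transaction
    max_client_conn = 200
    default_pool_size = 10

The perl/dbi scripts would then point their DSN at port 6432 instead of
5432. One caveat: with transaction pooling, session state such as
prepared statements or advisory locks does not carry over between
transactions, so session pooling may be the safer starting point if the
scripts rely on that.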