On Tue, 19 Oct 2021 at 23:20, Vijaykumar Jain <
vijaykumarjain.git...@gmail.com> wrote:

>
> On Tue, 19 Oct 2021 at 23:09, Vijaykumar Jain <
> vijaykumarjain.git...@gmail.com> wrote:
>
>>
>> On Tue, 19 Oct 2021 at 22:45, Saurav Sarkar <saurav.sark...@gmail.com>
>> wrote:
>>
>>> Hi All,
>>>
>>>
>>> A basic question on handling a large number of concurrent requests to the DB.
>>>
>>> I have a cloud service which can get a large number of requests, which
>>> will obviously trigger DB operations.
>>>
>>> Every DB will have some max connection limit, which can get exhausted
>>> under a large number of requests.
>>>
>>> I know DB connection pooling can be used to reuse connections, but it
>>> will not help when there is a large number of active concurrent connections.
>>> My queries are already optimised and short-lived.
>>>
>>> For that I need some queuing mechanism, like PgBouncer for Postgres:
>>> https://www.percona.com/blog/2021/02/26/connection-queuing-in-pgbouncer-is-it-a-magical-remedy/
>>>
>>> PgBouncer, I understand, is a proxy which needs to be installed separately
>>> on the web or DB server.
>>>
>>> I was wondering whether the usual client-side DB connection pooling
>>> libraries, like Apache DBCP, can also provide similar connection queuing
>>> while running in the application runtime.
>>>
>>
>>
>
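Regarding the PgBouncer queuing mentioned above: it sits between the application
and Postgres, accepts many client connections, and serves them from a small,
fixed number of server connections; clients beyond that limit simply wait for a
free server connection. A rough pgbouncer.ini sketch of that idea (database
name, auth file and pool sizes below are placeholders, not recommendations):

[databases]
appdb = host=127.0.0.1 port=5432 dbname=appdb

[pgbouncer]
listen_addr = 0.0.0.0
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
; transaction pooling releases the server connection at commit/rollback
pool_mode = transaction
; accept up to 1000 application connections ...
max_client_conn = 1000
; ... but open at most 20 server connections per database/user pair;
; the remaining clients queue until one is released
default_pool_size = 20

The application then connects to port 6432 instead of 5432.
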
Also, please check out this link that I forgot to include earlier:
Number Of Database Connections - PostgreSQL wiki
<https://wiki.postgresql.org/wiki/Number_Of_Database_Connections>
It explains why too many direct connections can result in performance issues.
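
On the question about client-side queuing: a bounded pool already behaves like a
queue from inside the application. A minimal sketch with Apache Commons DBCP2
(the JDBC URL, credentials and the orders query are placeholders for
illustration):

import org.apache.commons.dbcp2.BasicDataSource;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class PooledDb {
    private static final BasicDataSource DS = new BasicDataSource();

    static {
        // Placeholder connection settings.
        DS.setUrl("jdbc:postgresql://localhost:5432/appdb");
        DS.setUsername("app");
        DS.setPassword("secret");
        DS.setMaxTotal(20);         // never open more than 20 connections to Postgres
        DS.setMaxIdle(10);          // keep up to 10 idle connections for reuse
        DS.setMaxWaitMillis(5000);  // callers beyond maxTotal block ("queue") up to 5s
    }

    static int countOrders(long customerId) throws Exception {
        // getConnection() blocks while the pool is exhausted, which is the
        // queuing behaviour asked about; close() only returns it to the pool.
        try (Connection c = DS.getConnection();
             PreparedStatement ps = c.prepareStatement(
                     "SELECT count(*) FROM orders WHERE customer_id = ?")) {
            ps.setLong(1, customerId);
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return rs.getInt(1);
            }
        }
    }
}

If the wait exceeds maxWaitMillis, getConnection() throws instead of piling up
requests forever; similar knobs exist in HikariCP and most other JDBC pools.
Keep in mind the per-instance cap multiplied by the number of app instances
still has to stay below max_connections on the server, which is where a proxy
like PgBouncer in front of Postgres helps.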
