Centuries ago, Nostradamus foresaw when [EMAIL PROTECTED] (Peter Eisentraut) would 
write:
> Is there any practical limit on the number of parallel connections that a 
> PostgreSQL server can service?  We're in the process of setting up a system 
> that will require up to 10000 connections open in parallel.  The query load 
> is not the problem, but we're wondering about the number of connections.  
> Does anyone have experience with these kinds of numbers?

We commonly have a thousand connections open on some servers, and
while it works, we consider it somewhat problematic.  That many
backends tends to lead to a lot of spinlock contention.

You might want to look into pgpool:
<http://www2b.biglobe.ne.jp/~caco/pgpool/index-e.html>

Jan Wieck tried it out with his version of the TPC-W benchmark and
found that it cut down the _true_ number of connections, and that it
was very helpful in improving performance in situations where the
application imagined it needed a lot of connections.
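To make the pooling idea concrete (this is an illustration of the
general technique, not pgpool's own code or configuration), here is a
rough client-side sketch using psycopg2's connection pool; the DSN,
pool sizes, and query are made-up placeholders:

    # Many application threads share a small, fixed set of real
    # connections, so the server never sees more than maxconn backends.
    from psycopg2 import pool

    conn_pool = pool.ThreadedConnectionPool(
        minconn=5,
        maxconn=20,
        dsn="dbname=test user=app host=localhost",  # hypothetical DSN
    )

    def run_query(sql):
        conn = conn_pool.getconn()        # borrow a real connection
        try:
            with conn.cursor() as cur:
                cur.execute(sql)
                return cur.fetchall()
        finally:
            conn_pool.putconn(conn)       # hand it back for reuse

    print(run_query("SELECT count(*) FROM pg_stat_activity"))
    conn_pool.closeall()

pgpool does the same sort of multiplexing, but as a proxy in front of
the server, so the application can keep opening "connections" freely
while only a bounded number of real backends exist.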
-- 
(reverse (concatenate 'string "gro.gultn" "@" "enworbbc"))
http://www.ntlug.org/~cbbrowne/spiritual.html
"The last good thing written in C was Franz Schubert's Symphony number
9."  -- Erwin Dieterich
