Hi Greg,
thank you, I see the problem now: I had treated these values as per-server
settings rather than per-connection ones.
It now works well with 20 or more concurrent connections.

bye,
-- Csaba 

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Greg Stark
Sent: Tuesday, December 14, 2004 6:25 PM
To: [EMAIL PROTECTED]
Subject: Re: [GENERAL] Insufficient memory for this operation.


Együd Csaba (Freemail) <[EMAIL PROTECTED]> writes:

> shared_buffers  = 20000         # min 16, at least max_connections*2, 8KB each

You can lower this to 10,000 or even less; at 8KB per buffer, 20,000 buffers
already ties up about 160MB of shared memory.

> max_connections = 100
> work_mem = 16384                # min 64, size in KB

That's 16M per connection with a maximum of 100 connections. So that's up to
1.6G that postgres has been told it can grab. It's unlikely it would grab it
all at once though unless lots of connections are running queries with big
sorts.
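
For example, a more conservative combination might look something like this
in postgresql.conf (illustrative numbers only, not a recommendation; tune
them to the machine's actual RAM):

    shared_buffers = 10000    # 10000 * 8KB = ~80MB of shared memory
    work_mem = 4096           # 4MB per sort; ~400MB if all 100 connections sort at once

A session that genuinely needs a big sort can still raise the limit just for
itself:

    SET work_mem = 32768;     -- 32MB, for this session only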

--
greg





