Re: [GENERAL] exceptionally large UPDATE

2010-10-29 Thread Ivan Sergio Borgonovo
On Fri, 29 Oct 2010 10:21:14 -0400 Vick Khera wrote:
> On Thu, Oct 28, 2010 at 1:06 PM, Ivan Sergio Borgonovo wrote:
> > What I'm planning to do is:
> > max_connections = 5
> > shared_buffers = 240M
> > work_mem = 90MB
> > maintenance_work_mem = 1GB
> > max_fsm_pages = 437616
> > max_fsm_relat

Re: [GENERAL] exceptionally large UPDATE

2010-10-29 Thread Vick Khera
On Thu, Oct 28, 2010 at 1:06 PM, Ivan Sergio Borgonovo wrote:
> What I'm planning to do is:
> max_connections = 5
> shared_buffers = 240M
> work_mem = 90MB
> maintenance_work_mem = 1GB
> max_fsm_pages = 437616
> max_fsm_relations = 1200
> checkpoint_segments = 70
> default_statistics_target = 30
>
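For context, the settings quoted above belong in postgresql.conf and take effect on server restart (or reload, for some of them). A sketch of the proposed tuning, with values taken verbatim from the thread; whether they suit the workload depends on the roughly 2.5-3 GB of RAM mentioned later in the thread, and note these are 2010-era (8.x) parameters: max_fsm_pages and max_fsm_relations were removed in PostgreSQL 8.4.

```
# postgresql.conf fragment -- values as proposed in the thread
max_connections = 5
shared_buffers = 240MB            # quoted as "240M" in the thread
work_mem = 90MB                   # per sort/hash operation, per connection
maintenance_work_mem = 1GB        # used by CREATE INDEX, VACUUM, etc.
max_fsm_pages = 437616            # pre-8.4 free-space-map settings
max_fsm_relations = 1200
checkpoint_segments = 70          # more WAL between checkpoints for bulk writes
default_statistics_target = 30
```

With only 5 connections allowed, a large work_mem is comparatively safe; on a busier server the same value could exhaust memory, since it applies per operation.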

Re: [GENERAL] exceptionally large UPDATE

2010-10-28 Thread Ivan Sergio Borgonovo
On Thu, 28 Oct 2010 08:58:34 -0400 Vick Khera wrote:
> On Wed, Oct 27, 2010 at 10:26 PM, Ivan Sergio Borgonovo wrote:
> > I'm increasing maintenance_work_mem to 180MB just before
> > recreating the gin index. Should it be more?
> >
> You can do this on a per-connection basis; no need to alt

Re: [GENERAL] exceptionally large UPDATE

2010-10-28 Thread Vick Khera
On Wed, Oct 27, 2010 at 10:26 PM, Ivan Sergio Borgonovo wrote:
> I'm increasing maintenance_work_mem to 180MB just before recreating
> the gin index. Should it be more?
>
You can do this on a per-connection basis; no need to alter the config file. At the psql prompt (or via your script) just exe
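The advice here (cut off in the archive preview) is to raise maintenance_work_mem for the current session only, instead of editing postgresql.conf. A minimal sketch of what that looks like at the psql prompt; the table and index names are hypothetical, not from the thread:

```sql
-- Raise the memory available to index builds in this session only;
-- the config file and all other connections are unaffected.
SET maintenance_work_mem = '1GB';

-- Rebuild the GIN index with the larger allocation
-- (table/column names are illustrative).
CREATE INDEX items_tsv_gin ON items USING gin (tsv);

-- Optional: revert to the server default, or simply disconnect.
RESET maintenance_work_mem;
```

Because SET is scoped to the session, a one-off maintenance script can be generous with memory without risking the setting leaking into normal operation.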

Re: [GENERAL] exceptionally large UPDATE

2010-10-27 Thread Rob Sargent
Ivan Sergio Borgonovo wrote: I have to run a large UPDATE on a DB. The largest UPDATE involves a table that has triggers and a GIN index on a computed tsvector. The table is 1.5M records with about 15 fields of different types. I have roughly 2.5-3 GB of RAM dedicated to Postgres. UPDATE queries are

[GENERAL] exceptionally large UPDATE

2010-10-27 Thread Ivan Sergio Borgonovo
I have to run a large UPDATE on a DB. The largest UPDATE involves a table that has triggers and a GIN index on a computed tsvector. The table is 1.5M records with about 15 fields of different types. I have roughly 2.5-3 GB of RAM dedicated to Postgres. UPDATE queries are simple; few of them use joins and
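A common pattern for a bulk UPDATE of this size on a table carrying a GIN index (not spelled out in this truncated message, but consistent with the index-rebuild discussion later in the thread) is to drop the index first, run the update, then rebuild it in one pass rather than maintaining it row by row. A hypothetical sketch; all object names and the UPDATE itself are illustrative:

```sql
BEGIN;
-- Dropping the GIN index up front avoids per-row index maintenance
-- during the bulk UPDATE (names below are illustrative).
DROP INDEX IF EXISTS items_tsv_gin;

UPDATE items SET price = price * 1.1;  -- example bulk change

-- Rebuild with a generous per-session maintenance_work_mem,
-- as suggested elsewhere in the thread.
SET maintenance_work_mem = '1GB';
CREATE INDEX items_tsv_gin ON items USING gin (tsv);
COMMIT;

-- An UPDATE writes a new version of every touched row, so reclaim
-- the dead tuples and refresh planner statistics afterwards.
VACUUM ANALYZE items;
```

Note that VACUUM cannot run inside a transaction block, which is why it comes after the COMMIT. Triggers on the table still fire per row regardless; if they only maintain the tsvector column, recomputing that column in the same UPDATE can be cheaper than firing a trigger 1.5M times.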