Re: [PERFORM] error updating a very large table

2009-04-15 Thread Simon Riggs
On Wed, 2009-04-15 at 09:51 -0400, Tom Lane wrote:
> Brian Cox writes:
> > I changed the logic to update the table in 1M row batches. However,
> > after 159M rows, I get:
>
> > ERROR: could not extend relation 1663/16385/19505: wrote only 4096 of
> > 8192 bytes at block 7621407
> You're ou

Re: [PERFORM] error updating a very large table

2009-04-15 Thread Tom Lane
Brian Cox writes:
> I changed the logic to update the table in 1M row batches. However,
> after 159M rows, I get:
> ERROR: could not extend relation 1663/16385/19505: wrote only 4096 of
> 8192 bytes at block 7621407

You're out of disk space.

> A df run on this machine shows plenty of space:
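The relation path in the error, 1663/16385/19505, is tablespace OID / database OID / relfilenode. As a minimal sketch, assuming the relfilenode 19505 taken from the error above, the affected table can be looked up in the system catalog:

    -- Map the relfilenode from the error back to a relation name
    -- (1663 is the pg_default tablespace; 16385 is the database's OID)
    SELECT relname
      FROM pg_class
     WHERE relfilenode = 19505;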

Re: [PERFORM] error updating a very large table

2009-04-15 Thread Grzegorz Jaśkiewicz
On Wed, Apr 15, 2009 at 1:41 AM, Brian Cox wrote:
> ts_defect_meta_values has 460M rows. The following query, in retrospect not
> too surprisingly, runs out of memory on a 32 bit postgres:
>
> update ts_defect_meta_values set ts_defect_date=(select ts_occur_date from
> ts_defects where ts_id=ts_de

[PERFORM] error updating a very large table

2009-04-14 Thread Brian Cox
ts_defect_meta_values has 460M rows. The following query, in retrospect not too surprisingly, runs out of memory on a 32 bit postgres:

update ts_defect_meta_values set ts_defect_date=(select ts_occur_date from ts_defects where ts_id=ts_defect_id)

I changed the logic to update the table in 1M
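A minimal sketch of one such 1M-row batch, assuming ts_defect_id serves as the batch key and rewriting the correlated subquery as a join (the UPDATE ... FROM form and the range bounds are assumptions; table and column names come from the message above):

    -- Update one slice of the big table per transaction,
    -- joining instead of running a subquery for every row
    UPDATE ts_defect_meta_values v
       SET ts_defect_date = d.ts_occur_date
      FROM ts_defects d
     WHERE d.ts_id = v.ts_defect_id
       AND v.ts_defect_id >= 1
       AND v.ts_defect_id < 1000001;

Committing after each slice keeps per-transaction work bounded and lets VACUUM reclaim dead row versions between batches.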