"David Wilson" <[EMAIL PROTECTED]> writes:
> After dropping an index to do some full-table updating, I'm running
> into an out-of-memory issue while recreating one of my indices. This
> is on 8.3 running on Linux.
>
> The table in question has about 300m rows. The index is on a single
> integer column. There are approximately 4000 unique values among the
> rows.
>
> create index val_datestamp_idx on vals(datestamp) tablespace space2;
>
> About 30 seconds into the query, I get:
>
> ERROR: out of memory
> DETAIL: Failed on request of size 536870912.
>
> Increasing maintenance_work_mem from 1GB to 2GB changed nothing.

On Mon, Oct 13, 2008 at 6:44 AM, Gregory Stark <[EMAIL PROTECTED]> wrote:
> How much memory the OS allows Postgres to allocate will depend on a lot of
> external factors. At a guess you had some other services or queries running
> at the same time the first time which reduced the available memory.
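[For readers reproducing the scenario, a minimal sketch of the session described in the thread. The table, column, and tablespace names are the ones David posted; the SET line is an illustrative session-level override, not something the thread confirms he used. Note that 536870912 bytes is exactly 512MB, i.e. half the 1GB maintenance_work_mem setting.]

create index sketch:

    -- Check and (optionally) raise the sort-memory budget for this session.
    SHOW maintenance_work_mem;
    SET maintenance_work_mem = '1GB';

    -- The index build from the thread; fails ~30 seconds in with
    -- "ERROR: out of memory / DETAIL: Failed on request of size 536870912".
    CREATE INDEX val_datestamp_idx ON vals (datestamp) TABLESPACE space2;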