[moving to -hackers]

"Tom Lane" <[EMAIL PROTECTED]> writes:

> "Tomasz Rybak" <[EMAIL PROTECTED]> writes:
>> I tried to use COPY to import 27M rows into a table:
>> CREATE TABLE sputnik.ccc24 (
>>         station CHARACTER(4) NOT NULL REFERENCES sputnik.station24 (id),
>>         moment INTEGER NOT NULL,
>>         flags INTEGER NOT NULL
>> ) INHERITS (sputnik.sputnik);
>> COPY sputnik.ccc24(id, moment, station, strength, sequence, flags)
>> FROM '/tmp/24c3' WITH DELIMITER AS ' ';
>
> This is expected to take lots of memory because each row-requiring-check
> generates an entry in the pending trigger event list.  Even if you had
> not exhausted memory, the actual execution of the retail checks would
> have taken an unreasonable amount of time.  The recommended way to do
> this sort of thing is to add the REFERENCES constraint *after* you load
> all the data; that'll be a lot faster in most cases because the checks
> are done "in bulk" using a JOIN rather than one-at-a-time.
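
For the archives, that load-then-add-constraint sequence looks roughly
like this (a sketch using the table names quoted above; the constraint
name is made up):

    BEGIN;
    CREATE TABLE sputnik.ccc24 (
            station CHARACTER(4) NOT NULL,  -- no REFERENCES yet
            moment INTEGER NOT NULL,
            flags INTEGER NOT NULL
    ) INHERITS (sputnik.sputnik);

    COPY sputnik.ccc24(id, moment, station, strength, sequence, flags)
    FROM '/tmp/24c3' WITH DELIMITER AS ' ';

    -- The FK is checked here in one pass over the loaded data,
    -- effectively as a join, instead of one trigger firing per row.
    ALTER TABLE sputnik.ccc24
            ADD CONSTRAINT ccc24_station_fkey
            FOREIGN KEY (station) REFERENCES sputnik.station24 (id);
    COMMIT;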

Hm, it occurs to me that we could still do a join against the pending
trigger event list... I wonder how feasible it would be to store the
pending trigger event list in a temporary table instead of in RAM.
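
The bulk validation that ADD CONSTRAINT performs is conceptually just an
anti-join over the whole table; something like this (a sketch, not what
the backend literally executes):

    -- Any row coming back means the constraint would fail.
    SELECT c.station
    FROM sputnik.ccc24 c
    LEFT JOIN sputnik.station24 s ON s.id = c.station
    WHERE s.id IS NULL
    LIMIT 1;

If the pending trigger events were spilled to a temp table instead, the
same kind of join could presumably be run against just the queued rows
rather than the whole table.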

-- 
  Gregory Stark
  EnterpriseDB          http://www.enterprisedb.com
  Ask me about EnterpriseDB's On-Demand Production Tuning
