List,

Can anyone suggest where the error below comes from, given that I'm loading
HTTP access log data whose rows and individual column values are all
reasonably small?

        logs=# COPY raw FROM '/path/to/big/log/file' DELIMITER E'\t' CSV;
        ERROR:  out of memory
        DETAIL:  Cannot enlarge string buffer containing 1073712650 bytes by 65536 more bytes.
        CONTEXT:  COPY raw, line 613338983

It was suggested in #postgresql that I'm hitting the 1GB MaxAllocSize, and
the arithmetic fits: growing a 1073712650-byte buffer by 65536 bytes needs
1073778186 bytes, just over MaxAllocSize (0x3FFFFFFF = 1073741823 bytes).
But I would have thought that limit only constrained large values in
individual columns, or whole rows, neither of which this data has. It's
worth noting that the failure comes after 613 million rows (somewhere
around 100GB of data) have already been loaded, and that I'm running this
COPY immediately after the "CREATE TABLE raw ..." in a single transaction.
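
For context, the whole load looks roughly like this (the column list here
is a made-up placeholder, not my real DDL):

        logs=# BEGIN;
        logs=# CREATE TABLE raw (ts timestamptz, host inet, request text,
        logs(#     status int, bytes bigint);
        logs=# COPY raw FROM '/path/to/big/log/file' DELIMITER E'\t' CSV;
        logs=# COMMIT;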

I've looked at line 613338983 in the file being loaded (+/- 10 rows) and can't 
see anything out of the ordinary.
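
For reference, pulling those rows out of a file this size is quick with
something along these lines (the trailing "q" stops sed at the last wanted
line instead of scanning the remaining data):

        logs=# \! sed -n '613338973,613338993p;613338993q' /path/to/big/log/file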

Disclaimer: I know nothing of PostgreSQL's internals, please be gentle!

Regards,
Tom

