Hello,

I was curious whether there is a known size limit for Postgres transactions. To
import data into my database, my Java application begins a transaction, imports
the data (into several different tables), and then commits the transaction on
success. This works great on small data sets, but not so well on large ones.
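For reference, the flow looks roughly like this (a minimal JDBC sketch; the
connection details, loadRecords(), and the parameter types are placeholders I've
filled in for illustration, and create_import is the PL/pgSQL function shown in
the error below):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.Collections;
    import java.util.List;

    public class Importer {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/mydb", "user", "secret")) {
                // One big transaction covering the whole import.
                conn.setAutoCommit(false);
                try (PreparedStatement ps = conn.prepareStatement(
                        "select * from create_import(?,?,?,?,?,?) as result")) {
                    for (Object[] record : loadRecords()) {
                        for (int i = 0; i < 6; i++) {
                            ps.setObject(i + 1, record[i]);
                        }
                        // The function returns one result row per call.
                        try (ResultSet rs = ps.executeQuery()) {
                            rs.next();
                        }
                    }
                }
                // Commit only after every record has been imported.
                conn.commit();
            }
        }

        // Placeholder: the real application streams ~150 million records.
        private static List<Object[]> loadRecords() {
            return Collections.emptyList();
        }
    }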

About 150 million records into the import process, I get the following error:

ERROR: lock AccessShareLock on object 51533/51769/0 is already held
CONTEXT: SQL statement "INSERT INTO table_name (col1, col2, col3, col4) VALUES
(val1, val2, val3, val4)"
     PL/pgSQL function "create_import" line 19 at SQL statement
STATEMENT: select * from create_import($1,$2,$3,$4,$5,$6) as result

I know my server can handle this much data (24 GB RAM, 2 TB of SAS disk, etc.),
but Postgres doesn't seem to like transactions this large.

Any thoughts?

Thank you for your time,

Brian Beyer
Purdue University
[EMAIL PROTECTED]
