Re: [GENERAL] Optimizing large data loads

2005-08-06 Thread John Wells
Richard Huxton said:
> You don't say what the limitations of Hibernate are. Usually you might
> look to:
> 1. Use COPY not INSERTs
Not an option, unfortunately.
> 2. If not, block INSERTs into BEGIN/COMMIT transactions of say 100-1000
We're using 50/commit... we can easily up this I suppose.
> 3
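The batching suggestion above (fewer, larger commits, flushing every few hundred to a thousand records) can be sketched as follows. This is a minimal, self-contained illustration of the flush-every-N loop; `Session` here is a hypothetical stub standing in for Hibernate's session, not the real API, so the batching logic itself is runnable on its own.

```java
public class BatchLoad {
    // Stub standing in for a Hibernate Session (assumption, not the real API).
    static class Session {
        int pending = 0;   // entities saved since the last flush
        int flushes = 0;   // how many batches went to the database
        void save(Object entity) { pending++; }
        void flush() { flushes++; pending = 0; }  // push batched INSERTs
        void clear() { /* real Hibernate: evict the first-level cache */ }
    }

    // Save totalRecords entities, flushing and clearing every batchSize
    // records instead of every 50; returns the number of flushes (batches).
    static int load(Session session, int totalRecords, int batchSize) {
        for (int i = 0; i < totalRecords; i++) {
            session.save(new Object());
            if ((i + 1) % batchSize == 0) {
                session.flush();  // one round-trip batch per batchSize rows
                session.clear();  // keep session memory flat during the load
            }
        }
        if (session.pending > 0) {   // flush the final partial batch
            session.flush();
            session.clear();
        }
        return session.flushes;
    }

    public static void main(String[] args) {
        // 10,500 rows in batches of 1000 -> 10 full batches + 1 partial = 11
        System.out.println("flushes=" + load(new Session(), 10_500, 1000));
    }
}
```

With a real Hibernate session the same loop would also wrap each batch in a transaction commit; raising the batch from 50 toward 1000 cuts the commit count by an order of magnitude.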

Re: [GENERAL] Optimizing large data loads

2005-08-05 Thread Richard Huxton
John Wells wrote: Hi guys, We have a Java process that uses Hibernate to load approximately 14 GB of data. On a dual-proc 2.4 GHz Xeon with 2048 MB RAM, it's currently taking over 13 hours to load (PostgreSQL 7.4.8). We're flushing from Hibernate every 50 records. I've turned fsync to false

[GENERAL] Optimizing large data loads

2005-08-05 Thread John Wells
Hi guys, We have a Java process that uses Hibernate to load approximately 14 GB of data. On a dual-proc 2.4 GHz Xeon with 2048 MB RAM, it's currently taking over 13 hours to load (PostgreSQL 7.4.8). We're flushing from Hibernate every 50 records. I've turned fsync to false in postgresql.conf,
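The poster mentions turning fsync off in postgresql.conf. A sketch of the 7.4-era settings commonly adjusted for bulk loads follows; the specific values are illustrative assumptions, not taken from the thread, and `fsync = false` trades crash safety for speed, so it is only appropriate for a load that can be rerun from scratch.

```ini
# 7.4-era postgresql.conf fragment for a bulk load (values are examples).
fsync = false              # skip WAL syncs; data loss on crash is possible
checkpoint_segments = 16   # fewer checkpoint pauses during a long load
sort_mem = 65536           # 7.4 name for work_mem, in KB; speeds index builds
```

After the load completes, fsync should be turned back on and the server restarted before the database holds anything that cannot be regenerated.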