hi there,

I am trying to import large data files into PostgreSQL. 
For now I use the xargs Linux command to feed the file line by line 
to separate processes, using the maximum number of available connections. 

We use pgpool as the connection pool in front of the database, so we try to 
maximize the concurrency of the data import. 
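For concreteness, a minimal sketch of the setup described above: xargs fanning the file out line by line to a fixed number of parallel workers. All names here are placeholders (data.txt, pgpool-host, mydb, mytable are assumptions, not from my actual setup), and the psql call is shown as a comment with a dry-run counter in its place, which makes any missed lines easy to detect:

```shell
# Sketch of the per-line xargs fan-out (all names are placeholders; the
# real worker command would be a psql INSERT going through pgpool).
seq 1 1000 > data.txt                     # stand-in for the real data file

# Real version might look something like:
#   xargs -a data.txt -P 8 -n 1 -I{} \
#       psql -h pgpool-host -d mydb -c "INSERT INTO mytable VALUES ('{}')"

# Dry run: echo each line through 8 parallel workers and collect the
# output, so the input and output line counts can be compared.
xargs -a data.txt -P 8 -n 1 echo > out.txt
wc -l < out.txt                           # should match wc -l < data.txt
```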

The problem is that it seems to work well, but we miss a line once in a while, 
and that is not acceptable. It also creates zombie processes ;(. 

Does anybody have any other tricks that will do the job?

thanks,

Henk
-- 
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general