On Tue, Jun 19, 2018 at 10:17 PM Ravi Krishna <srkris...@yahoo.com> wrote:

> In order to test a real-life scenario (and use it for benchmarking) I want
> to load a large amount of data from CSV files.
> The requirement is that the load should happen like an application writing
> to the database (that is, no COPY command).


Once you have parsed the data, it is fairly easy to produce PostgreSQL's "COPY
... FROM stdin" format, with the fields separated by tabs. For a simple table
(t1) it could look like:

COPY t1 (f1,f2) FROM stdin;
3<tab>Joe
7<tab>Jane
\.

This data can be piped directly to psql, and it will be fast.

Note: NULL values must be written as \N; see the manual:
https://www.postgresql.org/docs/current/static/sql-copy.html

It is the same kind of data you get with pg_dump.
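If you want to generate that stream with a small script, here is a minimal
sketch in Python (the database name testdb and the input file data.csv are
just placeholders; table t1 with columns f1 and f2 is the example from above,
and the escaping only covers the common special characters):

#!/usr/bin/env python3
# Sketch: convert a CSV file to COPY-FROM-stdin text format and pipe it to psql.
import csv
import subprocess
import sys

CSV_FILE = "data.csv"       # placeholder input file
TABLE = "t1"
COLUMNS = ("f1", "f2")

def to_copy_field(value):
    # Empty CSV fields are treated as NULL, which the COPY text format expects as \N.
    if value == "":
        return "\\N"
    # Escape the characters that are special in the COPY text format.
    return (value.replace("\\", "\\\\")
                 .replace("\t", "\\t")
                 .replace("\n", "\\n")
                 .replace("\r", "\\r"))

def main():
    # Feed the COPY command and its data to psql on stdin (database name is a placeholder).
    psql = subprocess.Popen(["psql", "-d", "testdb"],
                            stdin=subprocess.PIPE, text=True)
    psql.stdin.write("COPY %s (%s) FROM stdin;\n" % (TABLE, ", ".join(COLUMNS)))
    with open(CSV_FILE, newline="") as f:
        for row in csv.reader(f):
            psql.stdin.write("\t".join(to_copy_field(v) for v in row) + "\n")
    psql.stdin.write("\\.\n")   # end-of-data marker
    psql.stdin.close()
    sys.exit(psql.wait())

if __name__ == "__main__":
    main()

The same stream could just as well be written to a file and loaded later with
psql -f, if you prefer to separate the conversion from the load.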

./hans
