On 7/24/2014 12:04 PM, William Nolf wrote:

This is probably an easy one for most SQL users, but I don't use it very often.

We have a Postgres database that was used for an application we no longer use. However, we would like to copy/dump the tables to files, text or CSV, so we can post them to SharePoint.

Copy seems to be what I want, but it only does one table at a time. There are 204 tables with schemaname=public. I need to copy the tables that have data to files. I need something like a for loop that checks whether each table is populated and, if so, copies it to a tablename.csv file.

Any ideas?


What scripting/programming language are you best in? I'd probably whip up a Perl script to do that.

Start with something like:

select table_schema||'.'||table_name from information_schema.tables where table_schema not in ('pg_catalog','information_schema')

which will output a list of all table names.
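To sanity-check that list before scripting anything, the same query can be run straight from the shell; a minimal example, where dbname stands in for the actual database name:

    # -t drops headers and row counts, so the output is just one
    # schema-qualified table name per line, e.g. public.orders
    psql -t -c "select table_schema||'.'||table_name from information_schema.tables where table_schema not in ('pg_catalog','information_schema')" dbname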

I suppose even bash scripting...

for table in $(psql -tc "select table_schema||'.'||table_name from information_schema.tables where table_schema not in ('pg_catalog','information_schema')" dbname); do
    psql -c "\copy $table to '$table.csv' with (format csv, header)" dbname
done
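
The original question also wanted to skip tables with no rows; one way to bolt that onto the loop, as a rough sketch (dbname is still a placeholder, and the empty-table check is an extra step, not something \copy does for you):

    for table in $(psql -At -c "select table_schema||'.'||table_name from information_schema.tables where table_schema not in ('pg_catalog','information_schema')" dbname); do
        # with -At, psql prints a bare t or f; exists() stops at the first row,
        # so the check stays cheap even on large tables
        if [ "$(psql -At -c "select exists (select 1 from $table)" dbname)" = "t" ]; then
            psql -c "\copy $table to '$table.csv' with (format csv, header)" dbname
        fi
    done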



--
john r pierce                                      37N 122W
somewhere on the middle of the left coast


