On 01.11.2018 18:27, Ravi Krishna wrote:
I have a project to develop a script/tool to copy data from DB2 to PG.  The 
approach I am thinking of is:

1. Export the data from DB2 to a text file with, say, pipe as the delimiter.
2. Load the data from the text file into PG using the COPY command.

In order to make it faster I can parallelize the export and load, with up to X 
tables running concurrently.
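
For one table, I imagine the two steps would look roughly like this (untested; 
database, schema, table and file names are just placeholders, and the DB2 
EXPORT modifiers may need adjusting for your data):

#!/bin/sh
# Step 1: export one DB2 table to a pipe-delimited text file (DEL format)
db2 connect to SOURCEDB
db2 "EXPORT TO /tmp/mytable.del OF DEL MODIFIED BY COLDEL0x7C SELECT * FROM MYSCHEMA.MYTABLE"

# Step 2: bulk-load the file into PG; \copy runs client-side, so the file
# only has to be readable on the machine running psql
psql -d targetdb -c "\copy myschema.mytable FROM '/tmp/mytable.del' WITH (FORMAT csv, DELIMITER '|')"

Running several copies of that, one per table, in the background would give the 
parallelism mentioned above.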

Is there a tool with which I can avoid first exporting and then loading?  One 
way I can think of is to use named pipes: export from DB2 into a pipe and 
simultaneously load from it into PG using COPY.
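
Something like this is what I have in mind for the pipe variant (again 
untested; I have not verified that DB2's EXPORT writes happily to a FIFO, and 
all names are placeholders):

#!/bin/sh
# Create a named pipe, start the DB2 export into it in the background,
# and let COPY read from the other end at the same time.
mkfifo /tmp/mytable.pipe

( db2 connect to SOURCEDB >/dev/null
  db2 "EXPORT TO /tmp/mytable.pipe OF DEL MODIFIED BY COLDEL0x7C SELECT * FROM MYSCHEMA.MYTABLE" ) &

psql -d targetdb -c "\copy myschema.mytable FROM '/tmp/mytable.pipe' WITH (FORMAT csv, DELIMITER '|')"

wait                      # wait for the background export to finish
rm /tmp/mytable.pipe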

Any other tool for this job?

thanks.


Haven't tried it myself, but you may be able to connect the DB2 database to your PostgreSQL cluster using this FDW module: https://github.com/wolfgangbrandl/db2_fdw

Then you could just use INSERT INTO ... SELECT statements to do the ETL process with the necessary type conversions and whatnot.
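
Roughly along these lines, I would guess (untested; I am going by the db2_fdw 
README, so the option names, the table definition and the credentials below are 
placeholders you would need to check and adapt):

psql -d targetdb <<'SQL'
CREATE EXTENSION db2_fdw;

-- "dbserver" should name the cataloged DB2 database/alias; option name taken
-- from the db2_fdw README, so verify it against your version
CREATE SERVER db2srv FOREIGN DATA WRAPPER db2_fdw OPTIONS (dbserver 'SAMPLE');

CREATE USER MAPPING FOR CURRENT_USER SERVER db2srv
    OPTIONS (user 'db2inst1', password 'secret');

-- Map one DB2 table; the columns and types here are made up
CREATE FOREIGN TABLE db2_employee (
    empno   integer,
    name    text,
    salary  numeric(9,2)
) SERVER db2srv OPTIONS (schema 'DB2INST1', table 'EMPLOYEE');

-- The actual ETL step: pull the rows across with whatever casts you need
INSERT INTO employee (empno, name, salary)
SELECT empno, name, salary FROM db2_employee;
SQL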

Looks like db2_fdw is DB2 LUW only though, so you might be out of luck if your DB2 is on IBM i (or z ;-)

Kind regards
Florian
