On Tue, May 8, 2018 at 7:44 PM, David G. Johnston <
david.g.johns...@gmail.com> wrote:

> On Tue, May 8, 2018 at 7:17 PM, tango ward <tangowar...@gmail.com> wrote:
>
>> I am trying to concatenate the value of column firstname and lastname
>> from source DB to name column of destination DB.
>>
>> for row in cur_t:
>>     cur_p.execute("""
>>                      INSERT INTO lib_author (
>>                                                  created, modified, last_name,
>>                                                  first_name, country,
>>                                                  school_id, name)
>>                     VALUES (current_timestamp, current_timestamp, %s, %s, %s,
>>                             (SELECT id FROM ed_school WHERE name='My Test School'),
>>                              %s
>>                             )
>>                     """, (row['lastname'], row['firstname'], '',
>>                           (row['firstname'], row['lastname'])))
>>
>>
Actually, what I would do looks nothing like that...

I'd use psql to \copy the relevant information out of the source DB into a
CSV file.
I'd use psql to \copy the just-exported data into a staging (temp/unlogged)
table in the target DB.
I'd then write, still in the psql script connected to the target machine:

INSERT INTO lib_author
SELECT ...
FROM temp_table;

DROP TABLE temp_table; (if unlogged; if it's truly a temp table it will be
dropped when the session ends)
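
Concretely, the psql side might look something like this - the source table
name (authors_src) and the space between the names are just placeholders on
my part:

-- connected to the source DB:
\copy (SELECT firstname, lastname FROM authors_src) TO 'authors.csv' WITH CSV

-- connected to the target DB:
CREATE UNLOGGED TABLE staging_author (firstname text, lastname text);
\copy staging_author FROM 'authors.csv' WITH CSV

INSERT INTO lib_author (created, modified, last_name, first_name,
                        country, school_id, name)
SELECT current_timestamp, current_timestamp,
       lastname, firstname, '',
       (SELECT id FROM ed_school WHERE name = 'My Test School'),
       firstname || ' ' || lastname
FROM staging_author;

DROP TABLE staging_author;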

A for-loop based migration should be a measure of last resort.  SQL is a
set-oriented language/system and you should design your processes to
leverage that.  Act on whole tables (or subsets thereof, via WHERE clauses)
at a time, not on individual records.

You can access the same API via Python, so you wouldn't have to use psql -
but moving the CSV data in bulk between the servers and performing the
calculations in bulk is the way to go if this is going to be anything more
than a one-time toy project after which you'll never touch a DB again.
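
For example, psycopg2 exposes COPY through copy_expert, so a rough sketch of
the same bulk approach from Python (the connection strings, the source table
name, and the space separator below are placeholders) might be:

import io
import psycopg2

src = psycopg2.connect("dbname=source_db")   # placeholder DSNs
dst = psycopg2.connect("dbname=target_db")

# Pull the relevant columns out of the source DB as CSV, in one pass.
buf = io.StringIO()
with src.cursor() as cur:
    cur.copy_expert(
        "COPY (SELECT firstname, lastname FROM authors_src) TO STDOUT WITH CSV",
        buf)
buf.seek(0)

# Load into a staging table on the target, then do one set-oriented INSERT.
with dst.cursor() as cur:
    cur.execute("CREATE TEMP TABLE staging_author (firstname text, lastname text)")
    cur.copy_expert("COPY staging_author FROM STDIN WITH CSV", buf)
    cur.execute("""
        INSERT INTO lib_author (created, modified, last_name, first_name,
                                country, school_id, name)
        SELECT current_timestamp, current_timestamp, lastname, firstname, '',
               (SELECT id FROM ed_school WHERE name = 'My Test School'),
               firstname || ' ' || lastname
        FROM staging_author
    """)
dst.commit()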

My $0.02

David J.

