I was also thinking of adding an 'is_new' column to the table, which I
would flag as 0, then doing a plain COPY of all the new rows in with
is_new set to 1. I'd then run a DELETE statement to remove every row that
is a duplicate and has a flag of 0, as the copy should leave some keys
with two rows, one old and one new.
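The flag-and-delete idea above can be sketched as follows, with SQLite standing in for PostgreSQL so the example is self-contained; the table and column names (items, id, val) are hypothetical, and the batch insert stands in for a real COPY:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Existing table; every current row carries is_new = 0.
cur.execute("CREATE TABLE items (id INTEGER, val TEXT, is_new INTEGER DEFAULT 0)")
cur.executemany("INSERT INTO items (id, val) VALUES (?, ?)",
                [(1, "old-1"), (2, "old-2"), (3, "old-3")])

# Bulk-load the incoming batch flagged is_new = 1 (COPY in real Postgres).
cur.executemany("INSERT INTO items (id, val, is_new) VALUES (?, ?, 1)",
                [(2, "new-2"), (3, "new-3"), (4, "new-4")])

# Drop the stale copy of every duplicated id, then clear the flag.
cur.execute("""DELETE FROM items
               WHERE is_new = 0
                 AND id IN (SELECT id FROM items WHERE is_new = 1)""")
cur.execute("UPDATE items SET is_new = 0")
conn.commit()

print(sorted(cur.execute("SELECT id, val FROM items").fetchall()))
# [(1, 'old-1'), (2, 'new-2'), (3, 'new-3'), (4, 'new-4')]
```

Note the second DELETE pass is unnecessary here: one statement removes all flagged-0 duplicates at once.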
Ivan Sergio Borgonovo wrote:
What if you know in advance which rows should be inserted
and you have a batch of rows that should be updated?
Is it still the fastest approach to insert them all into a temp table
with COPY?
What about the ones that have to be updated, if you have all the columns,
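The temp-table pattern under discussion can be sketched like this, again with SQLite standing in for PostgreSQL: bulk-load the batch into a staging table (COPY in real Postgres), UPDATE the rows that already exist, then INSERT the rest. The table names (items, staging) are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, val TEXT)")
cur.executemany("INSERT INTO items VALUES (?, ?)", [(1, "old-1"), (2, "old-2")])

# Staging table holding the incoming batch (loaded via COPY in Postgres).
cur.execute("CREATE TEMP TABLE staging (id INTEGER PRIMARY KEY, val TEXT)")
cur.executemany("INSERT INTO staging VALUES (?, ?)", [(2, "new-2"), (3, "new-3")])

# Update the rows that already exist in the target table ...
cur.execute("""UPDATE items
               SET val = (SELECT s.val FROM staging s WHERE s.id = items.id)
               WHERE id IN (SELECT id FROM staging)""")
# ... then insert the ones that do not.
cur.execute("""INSERT INTO items
               SELECT id, val FROM staging
               WHERE id NOT IN (SELECT id FROM items)""")
conn.commit()

print(cur.execute("SELECT id, val FROM items ORDER BY id").fetchall())
# [(1, 'old-1'), (2, 'new-2'), (3, 'new-3')]
```

Doing the UPDATE and the INSERT as two set-based statements keeps the whole merge to a handful of queries regardless of batch size.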
On Wed, 26 Dec 2007 20:48:27 +0100
Andreas Kretschmer <[EMAIL PROTECTED]> wrote:
> blackwater dev <[EMAIL PROTECTED]> schrieb:
>
> > I have some php code that will be pulling in a file via ftp.
> > This file will contain 20,000+ records that I then need to pump
> > into the postgres db. These re
blackwater dev <[EMAIL PROTECTED]> schrieb:
> I have some php code that will be pulling in a file via ftp. This file will
> contain 20,000+ records that I then need to pump into the postgres db. These
> records will represent a subset of the records in a certain table. I
> basically need an
I have some php code that will be pulling in a file via ftp. This file will
contain 20,000+ records that I then need to pump into the postgres db.
These records will represent a subset of the records in a certain table. I
basically need an efficient way to pump these rows into the table, replacing
the existing ones.