[GENERAL] Import large data set into a table and resolve duplicates?

2015-02-16 Thread Eugene Dzhurinsky
Hello! I have a huge dictionary table with series data generated by a third-party service. The table consists of 2 columns: id (serial, primary key) and series (varchar, not null, indexed). From time to time I need to apply a "patch" to the dictionary; the patch file consists of "series" data, one…
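
A minimal sketch of the layout described above (table and index names are assumptions; the post says only that "series" is indexed):

    -- Dictionary table as described in the question; names are assumptions.
    CREATE TABLE dictionary (
        id     serial PRIMARY KEY,
        series varchar NOT NULL
    );

    -- A plain index, as described; declaring it UNIQUE instead would let
    -- the database itself reject duplicate series values.
    CREATE INDEX dictionary_series_idx ON dictionary (series);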

Re: [GENERAL] Import large data set into a table and resolve duplicates?

2015-02-15 Thread Eugene Dzhurinsky
…eter. Then truncate the existing dictionary table and COPY the data from the merged file into it. Is that what you meant? Thank you! -- Eugene Dzhurinsky
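
A sketch of that truncate-and-reload reading, assuming the sorted, de-duplicated merge already happened outside the database and landed in /tmp/merged.tsv (the file path and sequence name are assumptions):

    BEGIN;
    TRUNCATE dictionary;
    -- Server-side COPY; from psql, \copy works without superuser rights.
    COPY dictionary (id, series) FROM '/tmp/merged.tsv';
    -- Keep the serial sequence in step with the reloaded ids.
    SELECT setval('dictionary_id_seq', (SELECT max(id) FROM dictionary));
    COMMIT;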

Re: [GENERAL] Import large data set into a table and resolve duplicates?

2015-02-15 Thread Eugene Dzhurinsky
…looks really promising, thank you John! I need only one index on the "patch_data" table, and I will re-use the existing index on the "dictionary". Thanks again! -- Eugene Dzhurinsky
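
One reading of that plan; only the name "patch_data" comes from the message, the rest is an assumed sketch:

    -- Stage the patch file; a temp table disappears at session end.
    CREATE TEMP TABLE patch_data (series varchar NOT NULL);
    COPY patch_data (series) FROM '/tmp/patch.txt';

    -- The single index on the staging side.
    CREATE INDEX patch_data_series_idx ON patch_data (series);

    -- Anti-join: the existing dictionary index on "series" serves the
    -- inner side, so only missing values are inserted.
    INSERT INTO dictionary (series)
    SELECT DISTINCT p.series
    FROM patch_data p
    WHERE NOT EXISTS (SELECT 1 FROM dictionary d WHERE d.series = p.series);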

Re: [GENERAL] Import large data set into a table and resolve duplicates?

2015-02-15 Thread Eugene Dzhurinsky
…" table on the "series" column. But perhaps it's better to try this, and if performance gets really bad, then do some optimizations like partitioning, etc. Thank you! -- Eugene Dzhurinsky
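
Before optimizing, it is cheap to check whether the planner actually uses the existing index for the anti-join; a hypothetical probe against the staging table sketched above:

    EXPLAIN ANALYZE
    SELECT p.series
    FROM patch_data p
    WHERE NOT EXISTS (SELECT 1 FROM dictionary d WHERE d.series = p.series);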

[GENERAL] Import large data set into a table and resolve duplicates?

2015-02-14 Thread Eugene Dzhurinsky
…d that:
- the dictionary table already consists of ~200K records
- the patch could be ~1-50K records long
- records cannot be removed from the dictionary, only added if they don't exist
Thanks! -- Eugene Dzhurinsky
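
On PostgreSQL 9.5 and newer (which postdates this thread), the "only added if not exist" requirement maps directly onto ON CONFLICT, assuming a unique index on dictionary(series):

    -- Assumes dictionary(series) carries a unique index.
    INSERT INTO dictionary (series)
    SELECT DISTINCT series FROM patch_data
    ON CONFLICT (series) DO NOTHING;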