I have a table with over 1,000,000 records containing names and phone numbers, and one of the indexes on the table is a unique index on the phone number. I am trying to copy about 100,000 more records into the table from a text file, but the COPY fails because of duplicate phone numbers in the file, and the error aborts the entire COPY without loading anything into the table.

Is there some way to get Postgres to copy the records from the file and simply skip any records that would violate the unique index? I found that doing the inserts one at a time from a PHP script takes MUCH longer than I'd like for a file this size, so I'd like to avoid that approach if I can.

Any help is appreciated. Thanks!
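P.S. The closest workaround I have come up with so far is to COPY into a temporary staging table (which has no unique index, so the load can't fail) and then insert only the non-duplicate rows into the real table. The table and column names below are just placeholders for my real schema, the file path is made up, and I haven't timed this on the full data set:

    -- staging table matching the layout of the text file
    CREATE TEMP TABLE staging (name text, phone text);

    -- load the whole file; no unique index here, so nothing aborts
    -- (delimiter/format options may need adjusting for the actual file)
    COPY staging FROM '/path/to/file.txt';

    -- insert only rows whose phone number is not already taken;
    -- DISTINCT ON also weeds out duplicate phone numbers within
    -- the file itself, which would otherwise still trip the index
    INSERT INTO phonebook (name, phone)
    SELECT DISTINCT ON (s.phone) s.name, s.phone
    FROM staging s
    WHERE NOT EXISTS
          (SELECT 1 FROM phonebook p WHERE p.phone = s.phone);

Would something along these lines be the right direction, or is there a cleaner way?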