Re: [HACKERS] Practical error logging for very large COPY statements

2005-11-22 Thread Christopher Kings-Lynne
Seems similar to the pgloader project on pgfoundry.org. It is similar and good, but I regard that as a workaround rather than the way forward. Yes, your way would be rad :)

Re: [HACKERS] Practical error logging for very large COPY statements

2005-11-21 Thread Christopher Kings-Lynne
Seems similar to the pgloader project on pgfoundry.org. Chris Simon Riggs wrote: If you've ever loaded 100 million rows, you'll know just how annoying it is to find that you have a duplicate row somewhere in there. Experience shows that there is always one, whatever oath the analyst swears beforehand …

Re: [HACKERS] Practical error logging for very large COPY statements

2005-11-21 Thread Andrew Dunstan
Tom Lane wrote: Simon Riggs <[EMAIL PROTECTED]> writes: What I'd like to do is add an ERRORTABLE clause to COPY. The main problem is how we detect a duplicate row violation, yet prevent it from aborting the transaction. If this only solves the problem of duplicate keys, and not any …

Re: [HACKERS] Practical error logging for very large COPY statements

2005-11-21 Thread Tom Lane
Simon Riggs <[EMAIL PROTECTED]> writes: > What I'd like to do is add an ERRORTABLE clause to COPY. The main > problem is how we detect a duplicate row violation, yet prevent it from > aborting the transaction. If this only solves the problem of duplicate keys, and not any other kind of COPY error, …
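The workaround the thread alludes to (and what loaders like pgloader do in spirit) is to wrap each row insert in a savepoint, so a constraint violation rolls back only that row and the offending tuple is diverted into an error table instead of aborting the whole load. A minimal sketch of that pattern, using SQLite from the Python standard library rather than PostgreSQL COPY, with hypothetical table names `target` and `errortable`:

```python
import sqlite3

# Autocommit mode so we can manage BEGIN/COMMIT and savepoints explicitly.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("CREATE TABLE errortable (id INTEGER, val TEXT, reason TEXT)")

# Incoming batch with one duplicate key (id=1 appears twice).
rows = [(1, "a"), (2, "b"), (1, "dup"), (3, "c")]

conn.execute("BEGIN")
for rid, val in rows:
    conn.execute("SAVEPOINT row_sp")
    try:
        conn.execute("INSERT INTO target VALUES (?, ?)", (rid, val))
        conn.execute("RELEASE SAVEPOINT row_sp")
    except sqlite3.IntegrityError as exc:
        # Roll back just this row; the rest of the load stays alive.
        conn.execute("ROLLBACK TO SAVEPOINT row_sp")
        conn.execute("RELEASE SAVEPOINT row_sp")
        conn.execute("INSERT INTO errortable VALUES (?, ?, ?)",
                     (rid, val, str(exc)))
conn.execute("COMMIT")

loaded = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
rejected = conn.execute("SELECT COUNT(*) FROM errortable").fetchone()[0]
print(loaded, rejected)  # 3 1
```

This is exactly the cost the proposed ERRORTABLE clause would avoid: a savepoint per row (or per batch, with a retry on failure) is far slower than a straight COPY, which is why the thread treats it as a workaround rather than the way forward.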