RE: Out of memory while finding duplicate rows

2003-02-24 Thread Madhu Reddy
Thanks Peter - good point > > > -Original Message- > > From: Peter Scott [mailto:[EMAIL PROTECTED] > > Sent: Sunday, February 23, 2003 5:17 AM > > To: [EMAIL PROTECTED] > > Subject: RE: Out of memory while finding duplicate > rows > > > > > > In artic

RE: Out of memory while finding duplicate rows

2003-02-23 Thread Beau E. Cox
Thanks Peter - good point > -Original Message- > From: Peter Scott [mailto:[EMAIL PROTECTED] > Sent: Sunday, February 23, 2003 5:17 AM > To: [EMAIL PROTECTED] > Subject: RE: Out of memory while finding duplicate rows > > > In article <[EMAIL PROTECTED]>

RE: Out of memory while finding duplicate rows

2003-02-23 Thread Peter Scott
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Beau E. Cox) writes: >Hi - > >Wait! If you are going to load the data into a database anyway, >why not use the existing database (or the one being created) to >remove duplicates. You don't even have to have an index on the >column you are making u
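Peter's suggestion above is to let the database itself remove the duplicates rather than doing it in script memory. The thread's script is Perl, but as an illustrative sketch (not the poster's actual code), the same idea in Python with SQLite looks like this: stage every row, then let a GROUP BY collapse the duplicates. Table and column names here are hypothetical.

```python
import sqlite3

def dedupe_via_db(rows):
    """Load raw rows into a staging table and let the database deduplicate.

    Uses an in-memory SQLite db for illustration; a file-backed database
    works the same way and keeps memory use flat regardless of file size.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staging (line TEXT)")
    conn.executemany("INSERT INTO staging (line) VALUES (?)",
                     [(r,) for r in rows])
    # GROUP BY collapses duplicates; ORDER BY MIN(rowid) keeps first-seen order.
    return [r[0] for r in conn.execute(
        "SELECT line FROM staging GROUP BY line ORDER BY MIN(rowid)")]

print(dedupe_via_db(["a", "b", "a", "c", "b"]))  # -> ['a', 'b', 'c']
```

As Peter notes, no index is strictly required for this to work; an index (or a UNIQUE constraint) only makes the deduplication faster on large tables.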

RE: Out of memory while finding duplicate rows

2003-02-23 Thread Beau E. Cox
PROTECTED] > Subject: RE: Out of memory while finding duplicate rows > > > Hi, > this data finally has to be loaded into the database.. > before loading into the database, we need to do some > validations like removing duplicates etc. > that is why i am doing this... > > i have ano

RE: Out of memory while finding duplicate rows

2003-02-22 Thread Madhu Reddy
Hi, this data finally has to be loaded into the database.. before loading into the database, we need to do some validations like removing duplicates etc. that is why i am doing this... i have another idea.. What i am planning is to first sort the file.. once we sort the file.. then it is easy to find duplicate

RE: Out of memory while finding duplicate rows

2003-02-22 Thread Beau E. Cox
Hi - > -Original Message- > From: Madhu Reddy [mailto:[EMAIL PROTECTED] > Sent: Saturday, February 22, 2003 11:12 AM > To: [EMAIL PROTECTED] > Subject: Out of memory while finding duplicate rows > > > Hi, > I have a script that will find out duplicate rows > in a file...in a file i hav
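The out-of-memory symptom in the original post typically comes from keeping every full row in an in-memory hash while scanning the file. One mitigation not spelled out in the thread (so this is an assumed technique, shown as an illustrative Python sketch rather than the poster's Perl script) is to store a fixed-size digest of each row instead of the row itself, which bounds the per-row memory cost:

```python
import hashlib

def count_duplicates_by_digest(lines):
    """Count duplicate rows while storing only 16-byte MD5 digests.

    Memory per distinct row is constant instead of proportional to row
    length. Illustrative only: a digest collision could, in theory,
    make two different rows look like duplicates.
    """
    seen = set()
    dup_count = 0
    for line in lines:
        d = hashlib.md5(line.encode()).digest()
        if d in seen:
            dup_count += 1
        else:
            seen.add(d)
    return dup_count

print(count_duplicates_by_digest(["r1", "r2", "r1", "r1"]))  # -> 2
```

If even the digest set is too large for memory, the sort-first approach discussed earlier in the thread avoids the hash entirely.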