Thanks Peter - good point
> -Original Message-
> From: Peter Scott [mailto:[EMAIL PROTECTED]
> Sent: Sunday, February 23, 2003 5:17 AM
> To: [EMAIL PROTECTED]
> Subject: RE: Out of memory while finding duplicate rows
>
>
> In article <[EMAIL PROTECTED]>,
> [EMAIL PROTECTED] (Beau E. Cox) writes:
>Hi -
>
>Wait! If you are going to load the data into a database anyway,
>why not use the existing database (or the one being created) to
>remove duplicates? You don't even have to have an index on the
>column you are making unique.
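
A minimal sketch of that database-side idea, for anyone following along. Nothing below is from the original thread: DBD::SQLite, the table and column names, and the pipe-delimited record layout are all assumptions; the point is only that a unique constraint lets the database drop duplicates during the load, with no duplicate-tracking done in Perl.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Assumed setup: SQLite file database, invented table/column names.
my $dbh = DBI->connect("dbi:SQLite:dbname=records.db", "", "",
                       { RaiseError => 1, AutoCommit => 0 });

# The PRIMARY KEY (a unique constraint) is what rejects repeated keys.
$dbh->do("CREATE TABLE IF NOT EXISTS records (
              rec_key TEXT PRIMARY KEY,
              payload TEXT)");

# INSERT OR IGNORE silently skips any row whose key already exists,
# so no %seen hash is needed on the Perl side at all.
my $sth = $dbh->prepare(
    "INSERT OR IGNORE INTO records (rec_key, payload) VALUES (?, ?)");

open my $fh, '<', 'input.dat' or die "Cannot open input.dat: $!";
while (my $line = <$fh>) {
    chomp $line;
    my ($key, $rest) = split /\|/, $line, 2;   # assumed record layout
    $sth->execute($key, $rest);
}
close $fh;

$dbh->commit;
$dbh->disconnect;

The memory cost moves from a Perl hash to the database's index, which lives on disk, so the 13-million-row file no longer has to fit in RAM.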
> To: [EMAIL PROTECTED]
> Subject: RE: Out of memory while finding duplicate rows
>
>
> Hi,
> those data finally have to be loaded into the database..
> before loading into the database, we need to do some
> validations like removing duplicates etc.
> that is why I am doing it this way...
>
> i have ano
Hi -
> -Original Message-
> From: Madhu Reddy [mailto:[EMAIL PROTECTED]
> Sent: Saturday, February 22, 2003 11:12 AM
> To: [EMAIL PROTECTED]
> Subject: Out of memory while finding duplicate rows
>
>
> Hi,
> I have a script that will find out duplicate rows
Hi,
I have a script that will find out duplicate rows
in a file... the file has 13 million records,
and out of those not more than 5% are duplicates.
To find duplicates I am using the following function...
while (<>) {                     # read the input file line by line into $_
    if (find_duplicates()) {     # true when the current row has been seen before
        $dup++;
    }
}
# find_duplicates() returns 1 if the current row is a duplicate
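
The body of find_duplicates() is cut off in the post; a minimal sketch of the usual hash-based version (the %seen hash and using the whole line as the key are assumptions, not taken from the original) would be:

my %seen;    # one entry per distinct row already read

# Assumed implementation: remember every row seen so far in a hash.
# Returns 1 if the current row ($_) has appeared before, 0 otherwise.
sub find_duplicates {
    my $key = $_;
    chomp $key;
    return $seen{$key}++ ? 1 : 0;
}

With 13 million rows, %seen ends up holding millions of keys, and a Perl hash that size takes far more memory than the raw data, which is the out-of-memory error being discussed and why the replies suggest letting a database (or a disk-based sort) do the de-duplication instead.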