What if it is a million URLs? Then what, Amol Sharma?
In that case even hashing and building a BST can be costly.
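For context, the hash-based approach under discussion might look like the following minimal Python sketch. It keeps every URL seen so far in an in-memory set, which is exactly the cost being questioned above for millions of URLs (the function name and input shape are assumptions, not from the thread):

```python
def find_duplicate_urls(lines):
    """Return the set of URLs that appear more than once.

    Minimal sketch of the hashing approach: a hash set of URLs seen
    so far; any URL already present is a duplicate. For millions of
    URLs the whole set must fit in memory.
    """
    seen = set()
    duplicates = set()
    for line in lines:
        url = line.strip()
        if url in seen:
            duplicates.add(url)
        else:
            seen.add(url)
    return duplicates
```

Average-case membership tests on a hash set are O(1), so the whole pass is roughly linear in the input, but the memory footprint grows with the number of distinct URLs.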
On Sat, Jul 30, 2011 at 11:44 PM, Amol Sharma <[email protected]>wrote:

> Please be more specific: were you asked about characters or words?
> If it is characters, the problem becomes quite simple:
> take an extra array of size 256, store the count of each character, and
> display the characters with a count greater than 1.
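The counting-array idea above can be sketched in Python like this (assuming the input is restricted to 8-bit character codes, hence 256 counters; the function name is hypothetical):

```python
def duplicate_chars(s):
    """Return the characters of s that occur more than once.

    Counting-array approach: one counter per possible 8-bit
    character code, then report every character whose count
    exceeds 1. Assumes all characters have ord(ch) < 256.
    """
    counts = [0] * 256
    for ch in s:
        counts[ord(ch)] += 1
    return [chr(i) for i, c in enumerate(counts) if c > 1]
```

This is O(n) time and O(1) extra space, since the counter array has a fixed size regardless of input length.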
>
>
> --
>
>
> Amol Sharma
> Third Year Student
> Computer Science and Engineering
> MNNIT Allahabad
>
>
>
>
> On Sat, Jul 30, 2011 at 10:36 PM, himanshu kansal <
> [email protected]> wrote:
>
>> How do you find a duplicate in a very large file?
>>
>>
>> I gave two approaches:
>> hashing,
>> and building a BST.
>> Can anyone suggest another, more efficient approach?
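For reference, the BST approach mentioned above, as a minimal Python sketch: insert each line into an (unbalanced) binary search tree keyed on the line itself, and report a duplicate whenever an equal key is found. The node layout and function names are assumptions for illustration:

```python
class Node:
    """A binary-search-tree node keyed on a line of the file."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Insert key into the tree; return (root, was_duplicate)."""
    if root is None:
        return Node(key), False
    if key == root.key:
        return root, True          # key already present: duplicate
    if key < root.key:
        root.left, dup = insert(root.left, key)
    else:
        root.right, dup = insert(root.right, key)
    return root, dup

def bst_duplicates(lines):
    """Return lines that were already in the tree when inserted."""
    root = None
    dups = []
    for line in lines:
        root, dup = insert(root, line)
        if dup:
            dups.append(line)
    return dups
```

With a balanced tree this is O(n log n) comparisons; an unbalanced tree like this one degrades to O(n^2) on sorted input, which is part of why it can be costly at scale.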
>>
>> --
>> You received this message because you are subscribed to the Google Groups
>> "Algorithm Geeks" group.
>> To post to this group, send email to [email protected].
>> To unsubscribe from this group, send email to
>> [email protected].
>> For more options, visit this group at
>> http://groups.google.com/group/algogeeks?hl=en.
>>
>>
>  --
> You received this message because you are subscribed to the Google Groups
> "Algorithm Geeks" group.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to
> [email protected].
> For more options, visit this group at
> http://groups.google.com/group/algogeeks?hl=en.
>
