Madhu Reddy wrote:
> Hi David,
>
> How are you? I am using your program for sorting (the program is
> included at the end of this mail). I am sorting a 7.5 GB file with
> 13 million records.
>
> I changed your program to the following:
>
>     if (@buffer > 500000) {
>         my $tmp = "tmp" . $counter++ . ".txt";
>         .....
>     }
>
> Here are the statistics: it took 5 hours 30 minutes to sort the
> 13-million-record file on 8 CPUs with 8 GB of RAM.
>
> How can I improve the speed?
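(The body of the quoted snippet is elided in the original mail. For context, a minimal reconstruction of that chunking step, assuming the omitted part sorts the buffered records and writes them to the numbered temp file, and using a placeholder input file name, might look like this:)

    #!/usr/bin/perl
    # Hypothetical reconstruction of the chunk-splitting step quoted above.
    # 'big_file.txt' and the sort-then-write body are assumptions, not part
    # of the original script.
    use strict;
    use warnings;

    my $counter = 0;
    my @buffer;

    open my $in, '<', 'big_file.txt' or die "open big_file.txt: $!";
    while (my $line = <$in>) {
        push @buffer, $line;
        if (@buffer > 500000) {
            my $tmp = "tmp" . $counter++ . ".txt";
            open my $out, '>', $tmp or die "open $tmp: $!";
            print {$out} sort @buffer;   # sort this chunk and write it out
            close $out;
            @buffer = ();                # start collecting the next chunk
        }
    }
    if (@buffer) {                       # flush the final, partial chunk
        my $tmp = "tmp" . $counter++ . ".txt";
        open my $out, '>', $tmp or die "open $tmp: $!";
        print {$out} sort @buffer;
        close $out;
    }
    close $in;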
My script is a single thread (a single process). Given that you have an 8-CPU machine, you should take advantage of that:

1. Split the file into multiple chunks.
2. For each chunk, create a separate thread (or, if your version of Perl doesn't support threading, a separate process, perhaps via fork itself) to sort the chunk, so the chunks are sorted in parallel.
3. The parent process waits for all the threads to finish sorting.
4. The parent process collects all the sorted chunks and merges them back together.

This should improve the speed. Most of the sorting and merging code that I have written should work without modification; a rough sketch of the fork-and-merge variant follows below.

david
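A minimal sketch of steps 2-4, assuming the chunk files from step 1 already exist as tmp*.txt, that records are plain lines compared lexically as whole lines, and that the output name sorted.txt is a placeholder:

    #!/usr/bin/perl
    # Hypothetical sketch: sort pre-split chunk files in parallel with fork(),
    # wait for the children, then merge the sorted chunks into one output file.
    use strict;
    use warnings;

    my @chunks = glob("tmp*.txt");       # chunks produced by the split step

    # 1. fork one child per chunk; each child sorts its chunk and exits
    my @pids;
    for my $chunk (@chunks) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {                 # child
            open my $in, '<', $chunk or die "open $chunk: $!";
            my @lines = <$in>;
            close $in;
            open my $out, '>', "$chunk.sorted" or die "open $chunk.sorted: $!";
            print {$out} sort @lines;    # plain lexical sort on whole lines
            close $out;
            exit 0;
        }
        push @pids, $pid;                # parent remembers the child's pid
    }

    # 2. parent waits for every child to finish
    waitpid($_, 0) for @pids;

    # 3. k-way merge of the sorted chunk files
    my @fh;
    for my $chunk (@chunks) {
        open my $h, '<', "$chunk.sorted" or die "open $chunk.sorted: $!";
        push @fh, $h;
    }
    my @head = map { scalar readline($_) } @fh;   # front line of each chunk

    open my $merged, '>', 'sorted.txt' or die "open sorted.txt: $!";
    while (grep { defined } @head) {
        my $min;                         # index of the smallest pending line
        for my $i (0 .. $#head) {
            next unless defined $head[$i];
            $min = $i if !defined $min || $head[$i] lt $head[$min];
        }
        print {$merged} $head[$min];
        $head[$min] = readline($fh[$min]);        # refill from that chunk
    }
    close $merged;

On an 8-CPU box you would normally cap the number of simultaneous children at the CPU count; once the chunks are sorted, the merge is a single sequential pass, so the parallel chunk sort is where the time is saved.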