On Fri, 2006-21-04 at 14:39 +0200, Rob Coops wrote:
> Good day list,
>
> I hope there is someone out there who can point me in the right direction
> to find a solution for this.
>
> I have two quite large hashes, each several hundred MB in size, and I want
> to merge them into a single hash with some logic. This works, of course,
> but as one might imagine it takes quite a lot of memory, and depending on
> the machine it can simply run out of memory.
>
> Until recently these hashes were quite small, about 10 MB each, which was
> quite acceptable to do in memory, certainly because that was the easiest
> and fastest way to get this done (call me lazy).
>
> So my question is: how can I reduce the memory footprint of this
> application while keeping as much speed as possible? Any pointer in the
> right direction would be very much appreciated.
>
> Regards,
>
> Rob
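[For reference, the in-memory merge Rob describes presumably looks something like this minimal sketch; the hash names and collision rule (later values win) are illustrative assumptions, since the "some logic" he applies is not shown:]

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical small stand-ins for the two multi-hundred-MB hashes.
my %first  = ( foo => 1, bar => 2 );
my %second = ( bar => 20, baz => 3 );

# Naive in-memory merge: flatten both hashes into a third one.
# On key collisions, values from %second (listed last) win.
# With very large inputs, all three hashes live in RAM at once,
# which is exactly what exhausts memory.
my %merged = ( %first, %second );

print "$_ => $merged{$_}\n" for sort keys %merged;
```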
Are these hashes huge because they contain many, many elements, or just a few
elements with large sets of data? If the latter, you could try compressing the
data. If not, you would have to use temporary files to hold part of them while
you work on the rest.

-- 
__END__
Just my 0.00000002 million dollars worth,
---
Shawn

"For the things we have to learn before we can do them, we learn by doing them."
  Aristotle

* Perl tutorials at http://perlmonks.org/?node=Tutorials
* A searchable perldoc is at http://perldoc.perl.org/
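[One standard way to get the "temporary files" approach almost for free is to tie the merged hash to a disk file, so each store goes to disk instead of RAM. A minimal sketch using the core SDBM_File module; the file path and sample data are illustrative. Note SDBM has a small per-record size limit (roughly 1 KB per key+value pair), so for large values DB_File or DBM::Deep from CPAN would be more suitable:]

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl;        # for O_RDWR, O_CREAT
use SDBM_File;    # core module; swap in DB_File for larger records

# Tie the merged hash to disk files (/tmp/merged.pag, /tmp/merged.dir)
# so the result never has to fit in memory all at once.
tie my %merged, 'SDBM_File', '/tmp/merged', O_RDWR | O_CREAT, 0666
    or die "Cannot tie merged hash: $!";

# Stand-in for one of the real multi-hundred-MB hashes.
my %first = ( foo => 1, bar => 2 );

# Copy key by key; each() avoids building a full key list in memory,
# and each store is written through to the tied file.
while ( my ( $k, $v ) = each %first ) {
    $merged{$k} = $v;
}

print "bar is $merged{bar}\n";

untie %merged;
```

The same loop run once per source hash merges them on disk; only one key/value pair is held in memory at a time, which trades speed for a much smaller footprint.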