Rob Coops wrote:

> I have two quite large hashes, each several hundred MB in size. Now I
> want to merge these into a single hash with some logic. This works, of
> course, but as one might imagine it takes quite a lot of memory and
> can, depending on the machine, simply run out of memory.

What is the structure of these hashes?

Does each hash contain a lot of redundancy? If so, try to 'normalize'
them, which basically means splitting them up into even more hashes. :)
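For example, something along these lines could help if many keys share
the same large value (just a sketch; %big, %values and %normalized are
made-up names):

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Toy data: several keys pointing at identical, large values.
  my %big = (
      host_a => 'a very long configuration string ...',
      host_b => 'a very long configuration string ...',
      host_c => 'another long string ...',
  );

  my ( %values, %normalized );
  my $next_id = 1;

  for my $key ( keys %big ) {
      my $val = $big{$key};

      # Store each distinct value only once, under a short numeric id.
      $values{$val} = $next_id++ unless exists $values{$val};

      # The 'normalized' hash only holds the small id per key.
      $normalized{$key} = $values{$val};
  }

  # Reversing %values gives id => value, so the original data can
  # still be looked up when needed.
  my %id_to_value = reverse %values;
  print "$_ => $id_to_value{ $normalized{$_} }\n"
      for sort keys %normalized;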

How many keys? What is the minimal/average/maximal size of the values?
How about storing the data in a database?
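A tied on-disk hash would also keep the merged data out of RAM. Below is
only a sketch using DB_File (which needs Berkeley DB; DBM::Deep or a
real RDBMS would do as well), with a placeholder file name and a trivial
"second hash wins" merge rule in place of the actual logic:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use DB_File;
  use Fcntl;

  # Tie the merged hash to a file on disk instead of keeping it in memory.
  my %merged;
  tie %merged, 'DB_File', 'merged.db', O_RDWR | O_CREAT, 0666, $DB_HASH
      or die "Cannot open merged.db: $!";

  # Pretend these were read from the two large data sources.
  my %first  = ( a => 1,  b => 2 );
  my %second = ( b => 20, c => 3 );

  # Copy the first hash, then let the second one win on duplicate keys
  # (substitute whatever merge logic is actually needed).
  $merged{$_} = $first{$_}  for keys %first;
  $merged{$_} = $second{$_} for keys %second;

  print "$_ => $merged{$_}\n" for sort keys %merged;

  untie %merged;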

-- 
Affijn, Ruud

"Gewoon is een tijger."



