Thanks for the tips; I think a combination of the two will be my best bet
for this one.

Store the bulk on disk or in a DB and use several smaller hashes to do the
merging. After that I can retrieve the bulk from disk/DB while looping over
the resulting combined hash. (Note to self: must not forget to drop those
smaller hashes as soon as they are no longer useful, so the memory can be
reclaimed.)
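A minimal sketch of that plan, using core SDBM_File as a stand-in for the on-disk store (DB_File or a real database would work the same way; the file name and the "b wins on conflicts" merge rule are just assumptions for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl;
use SDBM_File;

# On-disk store for the bulky values (SDBM ships with core Perl).
tie my %store, 'SDBM_File', '/tmp/merged_store', O_RDWR | O_CREAT, 0666
    or die "Cannot tie store: $!";

# Two "large" source hashes -- tiny here, just to show the shape.
my %a = ( foo => 'big value A', bar => 'big value B' );
my %b = ( bar => 'big value C', baz => 'big value D' );

# Push the bulk to disk; keep only lightweight key hashes in memory.
my ( %keys_a, %keys_b );
while ( my ( $k, $v ) = each %a ) { $store{"a:$k"} = $v; $keys_a{$k} = 1; }
while ( my ( $k, $v ) = each %b ) { $store{"b:$k"} = $v; $keys_b{$k} = 1; }
undef %a; undef %b;    # originals no longer needed

# Merge on the small key hashes; values are pointers into the store.
# (Here the second hash wins on conflicts -- adapt to your own logic.)
my %merged;
$merged{$_} = "a:$_" for keys %keys_a;
$merged{$_} = "b:$_" for keys %keys_b;
undef %keys_a; undef %keys_b;    # note to self: drop these ASAP

# Loop over the combined hash, pulling the bulk back from disk.
for my $k ( sort keys %merged ) {
    print "$k => $store{ $merged{$k} }\n";
}

untie %store;
unlink glob '/tmp/merged_store.*';    # clean up the demo files
```

The point is that only the keys (plus a short tag saying which source hash owns the winning value) ever live in memory at merge time; the multi-hundred-MB values stay on disk until the final loop needs them.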


On 4/21/06, Dr.Ruud <[EMAIL PROTECTED]> wrote:
>
> Rob Coops schreef:
>
> > I have two quite large hashes, each several hundred MB in size, and I
> > want to merge these into a single hash with some logic. This works of
> > course, but as one might imagine it takes quite a lot of memory and
> > can, depending on the machine, simply run out of memory.
>
> What is the structure of these hashes?
>
> Does each hash contain a lot of redundancy? If so, try to 'normalize'
> them, which basically means splitting them up in even more hashes. :)
>
> How many keys? What is the minimal/average/maximal size of the values?
> How about storing the data in a database?
>
> --
> Affijn, Ruud
>
> "Gewoon is een tijger."
>
>
