On Jan 4, 10:17 PM, Fredrik Lundh <[EMAIL PROTECTED]> wrote:
> wanzathe wrote:
> > I have a binary file named test.dat containing 9,600,000 records.
> > The record format is int a + int b + int c + int d.
> > I want to build a dict like this: key = (int a, int b), value = (int c, int d).
> > I chose bsddb, and it takes about 140 seconds to build the dict.
>
> you're not building a dict, you're populating a persistent database.
> storing ~70000 records per second isn't that bad, really...
>
> > What can I do to make my program run faster?
> > Or is there another way I could choose?
>
> why not just use a real Python dictionary, and the marshal module for
> serialization?
>
> </F>

Hi, Fredrik Lundh,
You are right, I'm populating a persistent database.
My first plan was to use a real Python dictionary with cPickle for
serialization, but that did not work because the number of records is
too large.
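For reference, Fredrik's suggestion (plain dict plus the marshal module) might look roughly like the sketch below. The file path, the little-endian 32-bit int record layout, and the helper names are assumptions, not anything from the thread; marshal does accept dicts with tuple keys and is typically faster than pickle for this kind of flat data.

```python
import marshal
import struct

# Assumed layout: four little-endian 32-bit signed ints per 16-byte record.
RECORD = struct.Struct("<iiii")

def build_dict(path):
    """Map (a, b) -> (c, d) for every record in the binary file."""
    table = {}
    with open(path, "rb") as f:
        data = f.read()  # 9,600,000 records is ~150 MB; fits in RAM on most machines
    for a, b, c, d in RECORD.iter_unpack(data):
        table[(a, b)] = (c, d)
    return table

def save_dict(table, path):
    # marshal.dump serializes the whole dict in one fast call
    with open(path, "wb") as f:
        marshal.dump(table, f)

def load_dict(path):
    with open(path, "rb") as f:
        return marshal.load(f)
```

Unlike bsddb, this keeps everything in memory, so it only helps if the machine can hold the full dict; the trade-off is one bulk read and one bulk dump instead of millions of per-record database writes.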
Thanks
-- 
http://mail.python.org/mailman/listinfo/python-list
