"Paul Rubin" <http://[EMAIL PROTECTED]> wrote:

> "George Sakkis" <[EMAIL PROTECTED]> writes:
> > I'm trying to create a dbm database with around 4.5 million entries
> > but the existing dbm modules (dbhash, gdbm) don't seem to cut
> > it. What happens is that the more entries are added, the more time
> > per new entry is required, so the complexity seems to be much worse
> > than linear. Is this to be expected
>
> No, not expected.  See if you're using something like db.keys() which
> tries to read all the keys from the db into memory, or anything like that.
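
For reference, the pattern Paul is warning about looks something like
this -- a sketch, where some_key stands in for whatever lookup the real
program does:

    # Anti-pattern: db.keys() materializes every key as an in-memory
    # list on each call, so doing this once per insert makes the whole
    # run quadratic in the number of entries.
    if some_key in db.keys():
        do_something()

    # Cheaper: dbm-style objects support direct membership tests.
    if some_key in db:
        do_something()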

It turns out it doesn't have to do with Python or the dbm modules. The same
program runs linearly on a different box and platform, so I guess it has to
do with the OS and/or the hard disk configuration.
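
One way to check this kind of scaling, in case anyone wants to reproduce
it: time the inserts in batches and watch whether the per-batch cost
grows. A minimal sketch, using Python 3's dbm.gnu (the modern name for
the gdbm module above; the file name, key format and sizes here are
made up):

    import dbm.gnu
    import time

    N = 4500000          # about the number of entries mentioned above
    BATCH = 100000       # print one timing line per batch

    db = dbm.gnu.open('test.db', 'nf')   # 'n' new file, 'f' fast mode
    start = time.time()
    for i in range(N):
        # Fixed-size synthetic keys/values; enough to see whether the
        # per-entry cost grows as the file fills up.
        db[b'key%09d' % i] = b'value%09d' % i
        if (i + 1) % BATCH == 0:
            now = time.time()
            print('%8d entries: %6.2f s for last batch' % (i + 1, now - start))
            start = now
    db.close()

If the per-batch times stay flat, insertion is effectively linear; if
they keep climbing, the slowdown is in the db or the filesystem
underneath it.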

George


-- 
http://mail.python.org/mailman/listinfo/python-list
