"George Sakkis" <[EMAIL PROTECTED]> writes: > I'm trying to create a dbm database with around 4.5 million entries > but the existing dbm modules (dbhash, gdbm) don't seem to cut > it. What happens is that the more entries are added, the more time > per new entry is required, so the complexity seems to be much worse > than linear. Is this to be expected
No, not expected. See if you're using something like db.keys(), which
tries to read all the keys from the db into memory, or anything like
that.
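
A minimal sketch of that trap, using the Python 2-era gdbm module
mentioned above (the file name, value string, and duplicate check are
made-up illustration, not the poster's actual code):

    import gdbm

    # 'n' creates a fresh database; 'f' enables gdbm's fast mode
    # (writes are buffered until sync() instead of flushed every time).
    db = gdbm.open('big.db', 'nf')

    for i in xrange(4500000):
        key = str(i)

        # BAD: db.keys() builds an in-memory list of every key already
        # stored, so each pass costs time and memory proportional to
        # the current database size. That produces the worse-than-linear
        # slowdown described in the question.
        #     if key in db.keys(): continue

        # OK: has_key() asks the database about one key directly.
        if not db.has_key(key):
            db[key] = 'some value'

    db.sync()
    db.close()

With has_key() (or a plain db[key] = value, which simply overwrites),
each insert stays close to constant time; the slowdown only appears
when something rebuilds the full key list on every iteration.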