I'm trying to create a dbm database with around 4.5 million entries, but the existing dbm modules (dbhash, gdbm) don't seem to cut it. The more entries I add, the more time each new entry takes, so the complexity looks much worse than linear. Is this to be expected, and if so, should I expect better (i.e. linear or near-linear) performance from a real database, e.g. sqlite?
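For concreteness, here's a minimal Python 2 sketch of the pattern (the file name and key/value strings are placeholders for my real data, and the per-chunk timing is just how I'm observing the slowdown):

import time
import gdbm

# Create a fresh database and insert entries one at a time,
# printing the elapsed time for every chunk of 100,000 inserts
# to watch how the per-entry cost grows.
db = gdbm.open("entries.db", "n")
t0 = time.time()
for i in xrange(4500000):
    db["key" + str(i)] = "value" + str(i)   # stand-in for real data
    if (i + 1) % 100000 == 0:
        t1 = time.time()
        print "%8d entries, last chunk took %.1f s" % (i + 1, t1 - t0)
        t0 = t1
db.close()

Swapping gdbm for dbhash shows the same pattern.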
George