Dan Stromberg <drsali...@gmail.com>:

> dict was able to do 1048576 operations on a dictionary before taking
> more than 120 seconds to complete - it took 75.3 seconds to do 1048576
> operations.
>
> AVL_tree was able to do 262144 operations on a dictionary before
> taking more than 120 seconds to complete - it took 66.1 seconds to do
> 262144 operations.
>
> blist.sorteddict was able to do 65536 operations on a dictionary
> before taking more than 120 seconds to complete - it took 77.3 seconds
> to do 65536 operations.
For a proper comparison, I'd like a fixed, identical dataset and set of
operations run against each data structure. How about this test program:

  generate random datasets of 100, 10000, 1000000 and 100000000 elements
  generate random testset of 1000000 elements
  for each data structure:
      for each dataset:
          initialize data structure with dataset
          head, tail = testset[:100], testset[100:]
          t0 = current timestamp
          for each element in head:
              add element to data structure
          for each element in tail:
              add element to data structure
              append element to head
              remove head.pop(0) from data structure
          for each element in head:
              remove element from data structure
          t1 = current timestamp
          report data structure type, dataset size, t1 - t0
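Here's a rough, untested Python sketch of that program. It assumes each
structure under test exposes a dict-style mapping interface (d[key] = value
and del d[key]); dict and blist.sorteddict do, while an AVL tree class
would need a small adapter. The sizes are the ones from the pseudocode
above; note that the 100000000-element dataset needs several gigabytes of
RAM.

  import random
  import time

  DATASET_SIZES = [100, 10000, 1000000, 100000000]
  TESTSET_SIZE = 1000000
  WINDOW = 100

  def benchmark(make_structure):
      testset = [random.random() for _ in range(TESTSET_SIZE)]
      for size in DATASET_SIZES:
          dataset = [random.random() for _ in range(size)]
          d = make_structure()
          for key in dataset:            # initialize with the dataset
              d[key] = None
          head, tail = testset[:WINDOW], testset[WINDOW:]
          t0 = time.time()
          for key in head:               # fill the 100-element window
              d[key] = None
          for key in tail:               # steady state: one add, one remove
              d[key] = None
              head.append(key)
              del d[head.pop(0)]         # drop the oldest window element
          for key in head:               # drain the remaining window
              del d[key]
          t1 = time.time()
          print(type(d).__name__, size, t1 - t0)

  benchmark(dict)

Random floats make key collisions between the dataset and the testset
effectively impossible, so the deletes only ever touch window elements.

Marko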