aurora wrote:
> <generator example>
> Note that there are 4 calls to inorder() and 10 yields. Indeed the
> complexity of traversing this kind of tree would be O(n^2)!
> <recursive function>
> There will be 4 calls to inorder() and 4 calls to foo(), giving a
> reasonable O(n) performance.
>
> Is it an inherent issue in the use of recursive generators? Is there
> any compiler optimization possible?
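(Your <generator example> and <recursive function> weren't included above, so the following is only a guess at the kind of code you mean; the Node class, the function names and the chained four-node tree are my own reconstruction, chosen so that it reproduces the counts you quote: 4 calls to the generator and 10 yields, versus 4 recursive calls and 4 calls to foo().)

class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def inorder_gen(node):
    # Recursive generator: a value produced at depth d is re-yielded by
    # every enclosing generator on the way up, so it costs d yields.
    if node.left is not None:
        for v in inorder_gen(node.left):
            yield v
    yield node.value
    if node.right is not None:
        for v in inorder_gen(node.right):
            yield v

def inorder_rec(node, foo):
    # Recursive function with a callback: every node costs exactly one
    # call to foo, so the traversal is O(n) overall.
    if node.left is not None:
        inorder_rec(node.left, foo)
    foo(node.value)
    if node.right is not None:
        inorder_rec(node.right, foo)

# A completely unbalanced four-node tree, chained through the right links.
tree = Node(1, right=Node(2, right=Node(3, right=Node(4))))

print(list(inorder_gen(tree)))   # [1, 2, 3, 4]: 4 generator calls, 10 yields
inorder_rec(tree, print)         # prints 1..4: 4 recursive calls, 4 calls to foo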
Do you actually have timing evidence that this is a problem with the data you are processing? The complexity of the algorithm may be important for very large data sets, but until you time it you won't know whether it is likely to be an issue for realistic data.

Your O(n^2) for the generator example is only O(n^2) if the tree is, as in your example, completely unbalanced. If you can keep your binary tree balanced it is O(n log n). If the tree is likely to become as wildly unbalanced as in your example, then you should consider using a different data structure (e.g. a B-tree) where O(n log n) would be the worst case.

If you make the more realistic comparison of n calls to foo versus n log n yields, the question is at what point the greater number of fast yields outweighs the smaller number of slow Python function calls.

Until you know speed really is an issue, go with whichever method makes the programming easier.
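If you do want to time it, something along these lines would do. It reuses the Node/inorder_gen/inorder_rec sketch above; the insert() helper and the 10000 shuffled keys are just an arbitrary test setup, not anything from your post:

import random
import timeit

# Build a reasonably well balanced binary search tree by inserting keys
# in shuffled order.
def insert(node, value):
    if node is None:
        return Node(value)
    if value < node.value:
        node.left = insert(node.left, value)
    else:
        node.right = insert(node.right, value)
    return node

keys = list(range(10000))
random.shuffle(keys)
root = None
for k in keys:
    root = insert(root, k)

def consume_generator():
    # Pull every value up through the chain of nested generators.
    for _ in inorder_gen(root):
        pass

def consume_callback():
    # Visit every value with a do-nothing callback.
    inorder_rec(root, lambda value: None)

print("generator:", timeit.timeit(consume_generator, number=20))
print("callback: ", timeit.timeit(consume_callback, number=20))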