On 2013-06-18, Antoine Pitrou <solip...@pitrou.net> wrote:
> Roy Smith <roy <at> panix.com> writes:
>
> You should read up again on the O(...) notation. It's an asymptotic
> complexity; it tells you nothing about the exact function values at
> different data points. So you can have two O(n) routines, one of which
> is always twice as fast as the other.
And you can have two O(n) routines, one of which is twice as fast for one
value of n while the other is twice as fast for a different value of n
(and that's true for any value of 'twice': 2X, 10X, 100X).

All the O() tells you is the general shape of the line. It doesn't tell
you where the line is or how steep the slope is (except in the case of
O(1), where you do know the slope is 0).

It's perfectly feasible that, for the range of values of n you care about
in a particular application, there's an O(n^2) algorithm that's way faster
than another O(log(n)) algorithm. [Though that becomes a lot less likely
as n gets large.]

-- 
Grant Edwards               grant.b.edwards        Yow! Where's SANDY DUNCAN?
                            at gmail.com
-- 
http://mail.python.org/mailman/listinfo/python-list
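To put concrete numbers on that last point: here's a small sketch with
made-up cost models (the constants 0.01 and 100 are purely illustrative,
not measurements of any real algorithm), showing an O(n^2) routine with a
tiny constant factor beating an O(log n) routine with a large one until n
gets big enough:

```python
import math

# Hypothetical cost models (illustrative constants, not real timings):
def cost_quadratic(n):
    # O(n^2) with a very small constant factor, e.g. a tight inner loop
    return 0.01 * n * n

def cost_logarithmic(n):
    # O(log n) with a large constant factor, e.g. heavy per-step overhead
    return 100 * math.log2(n)

for n in (10, 100, 1000, 10000):
    q = cost_quadratic(n)
    l = cost_logarithmic(n)
    winner = "O(n^2)" if q < l else "O(log n)"
    print(f"n={n:>6}: quadratic={q:>10.1f}  logarithmic={l:>8.1f}  faster: {winner}")
```

With these particular constants the quadratic routine wins for n up to a
few hundred and the logarithmic one wins after that -- exactly the
crossover the O() notation alone can't tell you about.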