On 9/1/2010 8:11 PM, John Bokma wrote:
> Terry Reedy <tjre...@udel.edu> writes:
>
>> On 9/1/2010 5:40 PM, John Bokma wrote:
>>
>> [..]
>>
>> Yes, I switched, because 'constant time' is a comprehensible claim
>> that can be refuted and because that is how some will interpret O(1)
>> (see below for proof;-).
>
> You make it now sound as if this interpretation is not correct or out of
> place.

Saying that an interpretation is comprehensible and common is hardly a putdown ;-). It simply is not unique. I also said that the other interpretation is not coherent for size-limited problems. However, if you mean 'constant', whatever you mean by that, why not say so?

Here is the technical definition given in
https://secure.wikimedia.org/wikipedia/en/wiki/Big_O_notation
I assure you that it is the standard one, invented by a mathematician over a century ago and used by mathematicians ever since. It was popularized in computer science by Donald Knuth in 1976 and used in The Art of Computer Programming. It is related to the definition of limits.
'''
Formal definition

Let f(x) and g(x) be two functions defined on some subset of the real numbers. One writes

    f(x) = O(g(x)) as x → ∞

if and only if, for sufficiently large values of x, f(x) is at most a constant multiplied by g(x) in absolute value. That is, f(x) = O(g(x)) if and only if there exists a positive real number M and a real number x0 such that

    |f(x)| ≤ M|g(x)| for all x > x0.
'''
For g(x) == 1, this reduces to |f(x)| ≤ M for some M (and large x), which is to say, f(x) is bounded (at least for large x).
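To make that reduction concrete, here is a small spot-check in Python. The particular function f and bound M are my own example, not from the definition: any function that stays within a fixed band is O(1) in this sense.

```python
import math

# f(x) = 2 + sin(x) never leaves the band [1, 3], so it is O(1):
# take M = 3 and any x0, and |f(x)| <= M holds for all x > x0.
def f(x):
    return 2 + math.sin(x)

M = 3
assert all(abs(f(x)) <= M for x in range(1, 10_000))
```

Nothing here says f is *constant*, only that it is bounded, which is the point of the distinction.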

Hmmm. By that definition, 1/x is O(1). Let x0 be any positive number and M = 1/x0. Some might find that disturbing. But it is the nature of the notation, as with limit notation, that it says *nothing* about the behavior of f for small x.
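The same kind of spot-check illustrates the 1/x case; the particular x0 below is arbitrary, per the argument above:

```python
# Checking that 1/x = O(1) under the formal definition:
# pick any x0 > 0, set M = 1/x0; then |1/x| <= M for all x > x0.
x0 = 0.001
M = 1 / x0                      # here M = 1000
xs = [x0 * (1 + k) for k in range(1, 100_000)]   # sample points > x0
assert all(abs(1 / x) <= M for x in xs)
# Meanwhile 1/x grows without bound as x -> 0+, behavior the
# notation deliberately ignores.
```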

> People who have bothered to read ItA will use O(1) and constant
> time interchangeably while talking of the order of growth of the running
> time algorithms

People who have bothered to read the Bible will use terms the way they are used in the Bible. So what? In other words, I do not regard ItA as scripture in the way you seem to. (I presume you mean the Cormen, et al. book. I partially read a library copy over 20 years ago but never bothered to buy a copy. I see it is now into its third edition.) If they disagree with Knuth and most others (which I would not claim without seeing their actual text) then so much the worse for them.

> and most of those are aware that 'big oh' hides a
> constant, and that in the real world a O(log n) algorithm can outperform
> an O(1) algorithm for small values of n.

Right. And if 'small values of n' include all possible values, then rejecting a particular O(log n) algorithm as 'unacceptable' relative to all O(1) algorithms is pretty absurd.
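A toy cost model makes the crossover visible. The constants 100 and 5 below are invented purely for illustration; the point is only that a large hidden constant can let an O(log n) algorithm win for every n that can actually occur:

```python
import math

def cost_constant(n):
    # "O(1)" algorithm with a large hidden constant.
    return 100

def cost_log(n):
    # "O(log n)" algorithm with a small constant factor.
    return 5 * math.log2(n)

# For n up to 2**20 the O(log n) algorithm is cheaper...
assert cost_log(2**10) < cost_constant(2**10)    # 50 vs 100
# ...and only past the crossover at n = 2**20 does O(1) win.
assert cost_log(2**21) > cost_constant(2**21)    # 105 vs 100
```

If the problem size is capped below the crossover, rejecting the O(log n) algorithm on asymptotic grounds alone gains nothing.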

> Uhm, O(1) /is/ constant time, see page 45 of Introduction to Algorithms,
> 2nd edition.

Given that the Wikipedia article references that section also, I wonder if it really disagrees with the definition above.

--
Terry Jan Reedy
