On Sep 13, 2010, at 3:07 PM, David Van Horn wrote:

>> In fact, ANY method to do this (even with a vector) takes Omega(n log n)
>> time. It needs to be able to produce any of n! different answers, so it
>> needs to make at least log(n!) decisions, which is Theta(n log n).
>> Otherwise there wouldn't be enough leaves on the decision tree.
>
> Doesn't it make n decisions? 1 decision for each element (the decision
> being, where does the element go in the shuffled list)?
Each of these is an n-way decision, which takes log(n) bits to specify. Any algorithm that solves this problem MUST generate at least n log(n) random bits of information, and therefore MUST take at least n log(n) time (ignoring parallelism).

Ryan Culpepper points out:

> I think the discrepancy comes from the fact that each of those decisions
> takes O(log n) bits, but we customarily pretend that "indexes" are O(1).

Exactly. As long as your n fits into a machine word, and you have at least n cells of truly-random-access memory, you can treat array indexing and random-number generation as taking O(1) time.

> Is this right? Does complexity analysis have a notation for distinguishing
> "true" complexity from "for up to k bits" complexity?

Not exactly a "notation" AFAIK, but people do routinely talk about whether the computational model allows bit operations in O(1) time, or integer operations in O(1) time, or whatever.

Stephen Bloch
sbl...@adelphi.edu
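
For concreteness, here is a quick sketch (not from the original thread) of a standard vector shuffle (Fisher-Yates), written with Racket's built-in random and vector operations; the helper names shuffle-vector! and shuffle-bits are illustrative only. Each pass makes one (n - i)-way decision, and summing log2 over all of those decisions gives the log2(n!) figure discussed above:

#lang racket

;; Sketch of the vector shuffle under discussion.  Step i picks j
;; uniformly from {i, ..., n-1}, i.e. one (n - i)-way decision, so a
;; full run consumes about log2(n) + log2(n-1) + ... + log2(2)
;; = log2(n!) random bits.
(define (shuffle-vector! v)
  (define n (vector-length v))
  (for ([i (in-range 0 (sub1 n))])
    (let* ([j (+ i (random (- n i)))]    ; the (n - i)-way decision
           [tmp (vector-ref v i)])
      (vector-set! v i (vector-ref v j))
      (vector-set! v j tmp)))
  v)

;; Information content of those decisions, next to the n*log2(n) estimate:
(define (log2 x) (/ (log x) (log 2)))

(define (shuffle-bits n)                 ; log2(n!)
  (for/fold ([bits 0.0]) ([k (in-range 2 (add1 n))])
    (+ bits (log2 k))))

;; > (shuffle-vector! (vector 1 2 3 4 5))
;; '#(4 1 5 3 2)                         ; some permutation of the input
;; > (shuffle-bits 1000)                 ; ~8529 bits of randomness needed
;; > (* 1000 (log2 1000))                ; ~9966, the same order of growth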