On Tue, Mar 24, 2015 at 11:19 AM, Steven D'Aprano
<steve+comp.lang.pyt...@pearwood.info> wrote:
> It is pretty inefficient, but it is a good toy example of recursion. It's
> also a good example of how *not* to write the Fibonacci series in practice;
> what is mathematically straightforward is not always computationally
> efficient.
>
> The question is, just how inefficient is it? How many calls to `fib` are
> made in calling fib(n)?
>
> Answer to follow.
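One way to see the answer empirically, assuming the usual naive definition with base cases fib(0) = 0 and fib(1) = 1 (the `calls` helper below is illustrative, not from the thread): the call count obeys the same recurrence as fib itself, plus one for the current call, which works out to 2*fib(n+1) - 1.

```python
def fib(n):
    """Naive doubly-recursive Fibonacci."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

def calls(n):
    """Number of calls fib(n) makes:
    calls(n) = 1 + calls(n-1) + calls(n-2),
    which equals 2*fib(n+1) - 1."""
    if n < 2:
        return 1
    return 1 + calls(n - 1) + calls(n - 2)

for n in (10, 20, 30):
    print(n, calls(n))
# 10 -> 177 calls, 20 -> 21891 calls, 30 -> 2692537 calls:
# the call count itself grows like the Fibonacci numbers.
```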
Exact answer not particularly important (though fun to calculate).
Practical answer: A *lot*. :)

I've never thought of the mathematical definition as inherently
recursive, but as inherently sequential. Sure, you can define the
counting numbers based on sets (0 is the cardinality of the empty set,
1 is the cardinality of the set containing the empty set, et-set-era),
but most people will define them by counting. The Fibonacci sequence is
logically defined by starting somewhere and then adding to the sequence.
Yes, you can define it recursively too, but it's just as easy to define
it as a progression. (Conversely, the squares *can* be defined as a
progression - you add the next odd number to the previous square - but
they're more logically defined by their indices. Some sequences work
better one way, some the other.)

Of course, mathematics works quite happily with infinity. You can
construct infinite sequences and do arithmetic on them, which computers
have a lot of trouble with. You can define anything in terms of
anything, no matter how long a chain of definitions it takes. And above
all, you can reduce problems to previously-solved states, which implies
that mathematics has a concept of caching. :)

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list
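Both views translate directly into Python - a sketch, with function names of my own choosing: the sequence-as-progression view is a simple loop, and "mathematics has a concept of caching" is literally what `functools.lru_cache` adds to the recursive definition, reducing it from exponential to linear work.

```python
from functools import lru_cache

def fib_progression(n):
    """Sequence as progression: start somewhere, keep adding."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

@lru_cache(maxsize=None)
def fib_recursive(n):
    """The mathematical recurrence, with solved subproblems cached."""
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

print(fib_progression(100))  # 354224848179261915075
print(fib_recursive(100))    # same value, instantly, thanks to the cache
```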