On Wed, 17 Apr 2013 18:33:09 -0600, Ian Kelly wrote:

> On Wed, Apr 17, 2013 at 5:40 PM, Mark Janssen
> <dreamingforw...@gmail.com> wrote:
>> Recursion the "bedrock" of language-design? I don't think so. From
>> what I know, a well-defined language ends at its symbols. It makes no
>> use of "infinities".
>
> From what I know, you can't have a Turing-complete language without
> some form of recursion. So yeah, it's pretty damn important in
> language design.
Incorrect. Early Fortran, which was definitely Turing complete, was
incapable of using recursion. But that doesn't matter, since any
recursive algorithm can be re-written as iteration (a few illustrative
sketches follow in the P.S. below). So long as a language can iterate
an indefinite number of times, it may be Turing complete. (Languages
which can only iterate a fixed number of times cannot be Turing
complete.)

Hell, Turing machines themselves are not recursive. Since they don't
have a concept of functions, they don't have a concept of functions
that call themselves. A Turing machine only has a handful of trivial
operations:

* read a cell
* write a cell
* advance the tape
* change direction

and so its grammar is correspondingly trivial. Actually, talking about
the grammar of a Turing machine is probably wrong. In practice, Turing
machines are specified as a finite (and usually small) table of states
and symbols. See here for examples:

http://en.wikipedia.org/wiki/Turing_machine_examples

So it isn't even correct to say that recursion is necessary for a
language's *grammar*. However, for any real-world practical language
(there's a reason that no practical language is based on Turing
machines) recursive grammars are extraordinarily useful.

> A finite, non-recursive grammar can only hope to accept a finite
> number of strings. To have an infinite language, the defining grammar
> must then be either infinite (not practical) or recursive.

I don't believe that is true, so long as the grammar has a concept of
"zero or more" of some syntactic element. For example, suppose your
grammar has a concept of integers, defined recursively as either a
digit, or a digit followed by an integer:

    INTEGER ::= DIGIT | DIGIT INTEGER

This can be defined more naturally as a digit followed by zero or more
digits:

    INTEGER ::= DIGIT (DIGIT)*


-- 
Steven
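
P.S. To make the "recursion can be re-written as iteration" point
concrete, here is a minimal Python sketch of the same algorithm written
both ways. The factorial example and the function names are purely
illustrative, not anything from the thread above:

    def factorial_recursive(n):
        # The function calls itself, using one stack frame per step.
        return 1 if n <= 1 else n * factorial_recursive(n - 1)

    def factorial_iterative(n):
        # A plain loop replaces the call stack entirely.
        result = 1
        for i in range(2, n + 1):
            result *= i
        return result

    assert factorial_recursive(10) == factorial_iterative(10) == 3628800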
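
The "finite table of states" point can be made concrete too. Here is a
minimal sketch of a Turing machine driven purely by a finite lookup
table, with no recursion anywhere. The machine here happens to be the
2-state "busy beaver", and the particular encoding of the table is just
one reasonable choice, not anything canonical:

    from collections import defaultdict

    # table[(state, scanned_symbol)] = (symbol_to_write, head_move, next_state)
    table = {
        ('A', 0): (1, +1, 'B'),
        ('A', 1): (1, -1, 'B'),
        ('B', 0): (1, -1, 'A'),
        ('B', 1): (1, +1, 'HALT'),
    }

    tape = defaultdict(int)        # blank cells read as 0
    head, state = 0, 'A'
    while state != 'HALT':
        write, move, state = table[(state, tape[head])]
        tape[head] = write
        head += move

    print(sorted(tape.items()))    # halts with four 1s on the tape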
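
And the "zero or more" point: "a digit followed by zero or more digits"
can be recognised with a plain loop or a regular expression, again with
no recursion anywhere. A tiny sketch (the function name is mine):

    import re

    INTEGER = re.compile(r'[0-9][0-9]*')   # DIGIT (DIGIT)*, i.e. [0-9]+

    def match_integer(s):
        """Iteratively check that s is one or more digits."""
        if not s or s[0] not in '0123456789':
            return False
        return all(ch in '0123456789' for ch in s[1:])

    assert match_integer('2013') and INTEGER.match('2013')
    assert not match_integer('x42')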