On Mon, Oct 10, 2005 at 09:11:02AM -0400, Matt Fowles wrote:
> Patrick~
> 
> The theoretical implementation of this is quite simple.  Keep a
> counter.  Every time a token is consumed from the input stream,
> reset it.  Every time a rule is followed, increment the counter.
> If the counter is ever greater than the number of productions in
> the grammar, then you have gone into infinite recursion.

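A minimal sketch of that counter idea, assuming a toy recursive-descent
matcher written in Python (not PGE's actual internals), might look like
this: reset the counter whenever a literal is consumed, increment it
whenever a subrule is entered, and give up once it exceeds the number
of rules in the grammar.

    class InfiniteRecursion(Exception):
        pass

    class Matcher:
        def __init__(self, rules):
            self.rules = rules    # name -> callable(matcher, text, pos) -> new pos or None
            self.counter = 0

        def consume(self, text, pos, literal):
            if text.startswith(literal, pos):
                self.counter = 0  # a token was consumed: reset the counter
                return pos + len(literal)
            return None

        def call(self, name, text, pos):
            self.counter += 1     # a subrule was entered: bump the counter
            if self.counter > len(self.rules):
                raise InfiniteRecursion(name)
            return self.rules[name](self, text, pos)

    # Directly left-recursive toy grammar:
    #     expr := expr '+' atom | atom ;   atom := 'A'
    def atom(m, text, pos):
        return m.consume(text, pos, 'A')

    def expr(m, text, pos):
        p = m.call('expr', text, pos)      # left recursion: nothing consumed yet
        if p is not None:
            p = m.consume(text, p, '+')
            if p is not None:
                return m.call('atom', text, p)
        return m.call('atom', text, pos)

    m = Matcher({'atom': atom, 'expr': expr})
    try:
        m.call('expr', 'A+A', 0)
    except InfiniteRecursion as e:
        print('infinite recursion suspected in', e)   # fires on the third nested call

On that grammar the check behaves as described: <expr> keeps re-entering
itself without consuming anything, so the counter climbs past the rule
count almost immediately.
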
What happens with something like:

    rule atom { A }
    rule binary { . <atom> | <expr> \* <atom> }
    rule expr { <binary> }

    "AB" ~~ / <expr> /

Here, the '.' in <binary> keeps "consuming" the initial "A" (thus 
resetting the counter), but then the <atom> subrule fails, so we fall
back to the second alternative, which recurses into <expr> and does
the whole thing all over again (with the counter having been reset
to zero).  So I guess we'd need to backtrack the counter as well...?
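
Coding that example up in the same toy Matcher (still just an
illustration, not PGE) makes the failure concrete: the '.' resets the
counter on every pass through <binary>, so the check never fires and
the match recurses until Python itself gives up.

    def atom(m, text, pos):
        return m.consume(text, pos, 'A')

    def binary(m, text, pos):
        if pos < len(text):
            m.counter = 0                  # the '.' consumes a character, resetting the counter
            p = m.call('atom', text, pos + 1)
            if p is not None:
                return p
        p = m.call('expr', text, pos)      # backtrack to pos and try  <expr> \* <atom>
        if p is not None:
            p = m.consume(text, p, '*')
            if p is not None:
                return m.call('atom', text, p)
        return None

    def expr(m, text, pos):
        return m.call('binary', text, pos)

    m = Matcher({'atom': atom, 'binary': binary, 'expr': expr})
    try:
        m.call('expr', 'AB', 0)
    except RecursionError:
        print('counter check never fired; Python hit its own recursion limit')

At least in this toy version, backtracking the counter along with the
input position would undo the reset when the first alternative fails,
and the check would fire again after a few passes.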

Perhaps what we want to be watching is "positions", so that we detect
when the depth of nested subrules activated from a common input position 
is greater than the total number of defined rules?  
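
One way that position-based idea might look, again sketched on the toy
matcher rather than on PGE (and reusing the InfiniteRecursion exception
from the first sketch): keep a count of the subrule calls currently
active at each input position, and give up when any one position's
count exceeds the number of rules.  The '.' consuming a character
elsewhere no longer hides the recursion, because the recursive calls
all re-enter at the same position.

    from collections import Counter

    class PositionMatcher:
        def __init__(self, rules):
            self.rules = rules
            self.active = Counter()        # input position -> subrule calls active there

        def consume(self, text, pos, literal):
            if text.startswith(literal, pos):
                return pos + len(literal)
            return None

        def call(self, name, text, pos):
            self.active[pos] += 1
            try:
                if self.active[pos] > len(self.rules):
                    raise InfiniteRecursion('%s at position %d' % (name, pos))
                return self.rules[name](self, text, pos)
            finally:
                self.active[pos] -= 1

    # Same grammar as above, minus the explicit counter reset (the
    # per-position table replaces the counter entirely):
    def atom(m, text, pos):
        return m.consume(text, pos, 'A')

    def binary(m, text, pos):
        if pos < len(text):                # the '.' consumes one character
            p = m.call('atom', text, pos + 1)
            if p is not None:
                return p
        p = m.call('expr', text, pos)      # second alternative: <expr> \* <atom>
        if p is not None:
            p = m.consume(text, p, '*')
            if p is not None:
                return m.call('atom', text, p)
        return None

    def expr(m, text, pos):
        return m.call('binary', text, pos)

    m = PositionMatcher({'atom': atom, 'binary': binary, 'expr': expr})
    try:
        m.call('expr', 'AB', 0)
    except InfiniteRecursion as e:
        print('infinite recursion suspected:', e)   # binary at position 0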

And, of course, in the case of rules containing closures, all bets
should be off for detecting infinite recursion, since the closure 
could be handling the termination of the recursion based on some
criteria other than tokens consumed from the input stream.  (But it's 
also probably easy to know when we've executed a closure and thus 
disable the check for infinite recursion when that happens.)
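
That last point could be as small as one flag on the toy Matcher above
(a hypothetical hook, not PGE's real closure handling): the first time
an embedded closure runs, switch the infinite-recursion check off for
the rest of the match.

    class ClosureAwareMatcher(Matcher):
        def __init__(self, rules):
            super().__init__(rules)
            self.check_enabled = True

        def run_closure(self, closure, *args):
            self.check_enabled = False     # a closure ran: its own logic may bound the recursion
            return closure(*args)

        def call(self, name, text, pos):
            self.counter += 1
            if self.check_enabled and self.counter > len(self.rules):
                raise InfiniteRecursion(name)
            return self.rules[name](self, text, pos)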

Pm
